The old model of ‘macroeconomics’ was built for a stable world free from large economic shocks. If we ever lived in such a world, we no longer do

21st century crises demand new economic understanding, say top economists

Leading economists, including Nobel laureate Joseph Stiglitz and Argentina's Minister of Economy Martin Guzman, together with academics from Oxford, Yale, Columbia, and UCLA, are today calling for a deep shift in how economists understand the overall economy. According to the new thinking, a series of massive economic shocks has left traditional economic theory in pieces, and the old macroeconomic paradigm is on its way out.

Oxford Professor David Vines says, ‘The old model of “macroeconomics” was built for a stable world free from large economic shocks. If we ever lived in such a world, we no longer do.’


Summarising the Rethinking Macroeconomics project in the Oxford Review of Economic Policy, Oxford economists Professor David Vines and Dr Samuel Wills call for a shift away from the assumptions which have underpinned economic theory for decades, and which do not meet today’s challenges. They argue a more open, more diverse paradigm is emerging, one far better equipped to deal with contemporary challenges such as the global financial crisis, climate change and COVID.

Professor Vines and Dr Wills argue that the old macroeconomic paradigm is being replaced by an approach which is less tied to idealised theoretical assumptions, takes real-world data more seriously, and is therefore far better suited to dealing with 21st-century challenges.

Dr Wills says, ‘This shift [in thinking] breaks with more than a century of macroeconomic thinking and has wide-ranging implications for economic thought and practice.’


Professor Vines adds, ‘An economic model that cannot handle serious shocks is like a medical science that does not study major disease outbreaks; it is likely to let us down at the most critical of moments. The old economic model has already failed us badly in the 21st century.’

Since the 2008 global financial crisis, which most macroeconomists failed to see coming, the field has endured a difficult decade. Its reputation was not helped by the slow and weak recovery from the crisis, which most observers now agree was made worse by worldwide austerity policies.

Macroeconomics has been dominated by one core approach for the last two decades: the ‘New Keynesian Dynamic Stochastic General Equilibrium’ (NK-DSGE) model. While this remains the profession’s generally accepted framework, the contributors to this issue of the Oxford Review of Economic Policy argue it is no longer fit for purpose.

According to today’s journal issue, the NK-DSGE model is unsuited to understanding large economic shocks. At the heart of the old model is the assumption that all disturbances are temporary and that the economy eventually returns to a single, stable ‘equilibrium’ in which it continues to grow. That assumption makes the model badly suited to studying large crises and transitions, in which unemployment multiplies and financial systems seize up.

Professor Vines insists, ‘The old core assumption that the economy returns to one equilibrium point is a major diversion from reality. If we want an economics up to the challenges of the twenty-first century, it is crucial the discipline understands the possibility of multiple economic equilibria.’

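What multiple equilibria look like can be shown with a deliberately simple toy model. The sketch below is our own illustration, not a model from the journal issue: an S-shaped aggregate feedback gives the economy two stable resting points, so otherwise identical economies can settle at high or low output depending on where a shock leaves them.

```python
# Toy illustration of multiple equilibria (not the NK-DSGE or MEADE models):
# output next period responds to output today through an S-shaped feedback,
# producing two stable equilibria and one unstable one in between.
import math

def next_output(y: float) -> float:
    """S-shaped feedback: strong near the middle, saturating at the extremes."""
    return 50 + 40 * math.tanh((y - 50) / 15)

def simulate(y0: float, periods: int = 60) -> float:
    y = y0
    for _ in range(periods):
        y = next_output(y)
    return y

# Two economies that differ only in where a shock left them:
print(simulate(55.0))  # converges to the high-output equilibrium (~89.6)
print(simulate(45.0))  # converges to the low-output equilibrium (~10.4)
# A unique-equilibrium model would send both paths to the same point.
```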

The NK-DSGE model is also built on over-idealised foundations: a set of assumptions about how people act in the economy and why. It assumes people are always well informed, rational, and single-mindedly dedicated to one particular goal.

Dr Wills maintains, ‘This analytical straitjacket means the old paradigm refuses to recognise that we all act under uncertainty about the future when we make economic decisions, and that our decisions are therefore always shaped by our best guesses, habits, and rules of thumb.’

Professor Vines and Dr Wills have named the newly emerging macroeconomic paradigm ‘MEADE’: Multiple Equilibrium and DiversE (in recognition of the Nobel prize-winning economist James Meade). The new approach studies how multiple economic equilibria can arise and uses a wide range of different kinds of models to understand what policymakers can do.

Professor Vines says, ‘This new paradigm has far deeper roots in the real world: it stresses the need for models based on detailed empirical understandings of how the economy actually is, rather than how it theoretically should be.  

‘Just as doctors only build up a full understanding of how a body is functioning using a whole host of empirically-grounded diagnostic tools, so economists must build up a full understanding of how economies function using a wide range of empirically-grounded tools and models.

‘Placing too much weight on the old, overly idealised model has blinkered macroeconomics; the blinkers are now coming off, and we want to speed this change along.’ 


A New Economic Paradigm

In the fall of 2018, University of California, Berkeley economist Emmanuel Saez said, to an audience of economists, policymakers, and the press, “If the data don’t fit the theory, change the theory.” He was speaking about a new data set he developed to show who gains from economic growth, the rise in monopoly and monopsony power, and the resulting importance of policies such as wealth taxes and antitrust regulation. As we in the crowd listened to his remarks, I could tell this was an important moment in the history of economic thought. Saez had fast become one of the most respected economists in the profession—in 2009, he won the John Bates Clark Medal, an honor given by the economics profession to the top scholar under the age of 40, for his work on “critical theoretical and empirical issues within the field of public economics.” And here he was, telling us that economics needs to change.

Saez is not alone. The importance of his comments is reflected in the research of a nascent generation of scholars who are steeped in the new data and methods of modern economics, and who argue that the field should—indeed, must—change. The 2007 financial meltdown and the ensuing Great Recession brought to the forefront a crisis in macroeconomics because economists neither predicted nor were able to solve the problem. But a genuine revolution was already underway inside the field as new research methods and access to new kinds of data began to upend our understanding of the economy and what propels it forward. While the field is much more technical than ever, the increasing focus on what the discipline calls “heterogeneity”—what we can think of as inequality—is, at the same time, undermining long-held theories about the so-called natural laws of economics.

It’s not clear where the field will land. In 1962, the historian of science Thomas Kuhn laid out how scientific revolutions happen. He defined a paradigm as a consensus among a group of scientists about the way to look at problems within a field of inquiry, and he argued that the paradigm changes when that consensus shifts. As Kuhn said, “Men whose research is based on shared paradigms are committed to the same rules and standards for scientific practice. That commitment and the apparent consensus it produces are prerequisites for normal science, i.e., for the genesis and continuation of a particular research tradition.” In this essay, I’ll argue that there is a new consensus emerging in economics, one that seeks to explain how economic power translates into social and political power and, in turn, affects economic outcomes. Because of this, it is probably one of the most exciting times to follow the economics field—if you know where to look. As several of the sharpest new academics—Columbia University’s Suresh Naidu, Harvard University’s Dani Rodrik, and Gabriel Zucman, also at UC Berkeley—recently said, “Economics is in a state of creative ferment that is often invisible to outsiders.”

The Twentieth Century Paradigms

We can demarcate three epochs of economic thinking over the past century. Each began with new economic analysis and data that undermined the prevailing view, and each altered the way the profession examined the intersection between how the economy works and the role of society and policymakers in shaping economic outcomes. In each of these time periods, economists made an argument to policymakers about what actions would deliver what the profession considered the optimal outcomes for society. Thanks in no small part to the real-world successes of the first epoch, policymakers today tend to listen to economists’ advice.

The first epoch began in the early twentieth century, when Cambridge University economist John Maynard Keynes altered the course of economic thinking. He started from the assumption that markets do not always self-correct, which means that the economy can be trapped in a situation where people and capital are not being fully utilized. Unemployment—people who want to work but are unable to find a job—is due to firms not deploying sufficient investment because they do not see enough customers to eventually buy the goods and services they would produce. From this insight flowed a series of policy prescriptions, key among them the idea that when the economy is operating at less than full employment, only government has the power to get back to full employment, by filling in the gap and providing sufficient demand. Keynes’s contribution is often summarized to be that demand—people with money in their pockets ready to buy—keeps the economy moving. For economists, the methodological contribution was that policymakers could push on aggregate indicators, such as by boosting overall consumption, to change economic outcomes.
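That aggregate logic has a standard textbook rendering, the income-expenditure multiplier. The sketch below is our illustration with invented numbers, not Keynes’s own notation: if households consume a fraction c of each extra unit of income, a rise in government spending of ΔG raises equilibrium income by ΔG/(1−c).

```python
# Textbook Keynesian-cross sketch (illustrative parameter values, not estimates):
# equilibrium income Y solves Y = C + I + G with consumption C = c0 + c*Y.
def equilibrium_income(c0: float, c: float, investment: float, government: float) -> float:
    # Y = c0 + c*Y + I + G  =>  Y = (c0 + I + G) / (1 - c)
    return (c0 + investment + government) / (1 - c)

c0, c = 20.0, 0.8          # autonomous consumption, marginal propensity to consume
base = equilibrium_income(c0, c, investment=50.0, government=30.0)
boost = equilibrium_income(c0, c, investment=50.0, government=40.0)
print(base, boost, (boost - base) / 10.0)  # multiplier = 1/(1-0.8) = 5
```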

Keynes explicitly framed his analysis as a rejection of the prevailing paradigm. He begins The General Theory by decrying the “classical” perspective, writing in Chapter 1 that “[T]he characteristics of the special case assumed by the classical theory happened not to be those of the economic society in which we actually live, with the result that its teaching is misleading and disastrous if we attempt to apply it to the facts of experience.” He spends the next chapter identifying the erroneous assumptions and presumptions of the prevailing economic analysis and lays out his work as a new understanding of the economy. In short, he argues the classical economists were wrong because they assumed that the economy always reverts to full employment.

Many credit the ideas he laid out in The General Theory with pulling our economy out of the depths of the Great Depression. He advanced a set of tools policymakers could use to ensure that the economy was kept as close to full employment as possible—a measure of economic success pleasing to democratically elected politicians. Certainly, the reconceptualization of the economy brought forth by the National Income and Product Accounts—an idea Keynes and others spearheaded in the years between the two world wars—shaped thinking about the economy. These data allowed the government, for the first time, to see the output of a nation—gross domestic product (GDP)—which quickly became policymakers’ go-to indicator to track economic progress.

By the late 1960s, Keynes’s ideas had become the prevailing view. In 1965, University of Chicago economist Milton Friedman wrote an essay in which he said, “[W]e are all Keynesians now,” and, in 1971, even Richard Nixon was quoted in The New York Times saying, “I am now a Keynesian in economics.”

Nonetheless, the field was shifting toward a consensus around what has become known as “neoliberalism”—and Friedman was a key player. Keynes focused on aggregates—overall demand, overall supply—and did not have a precise theory for how the actions of individuals at the microeconomic level led to those aggregate trends. Friedman’s work on the permanent income theory—the idea that people will consume today based on their assessment of what their lifetime income will be—directly contradicted Keynes’s assumption that the marginal propensity to consume would be higher among lower-income households. Whereas Keynes argued that government could increase aggregate consumption by getting money to those who would spend it, Friedman argued that people would understand this was a temporary blip and save any additional income. The microfoundations movement within economics sought to connect Keynes’s analysis, which focused on macroeconomic trends—the movement of aggregate indicators such as consumption or investment—to the behavior of individuals, families, and firms that make up those aggregates. It reached its apex in the work of Robert Lucas Jr., who argued that in order to understand how people in the economy respond to policy changes, we need to look at the micro evidence.
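The disagreement can be put in a single stylized contrast. In the sketch below (invented numbers, not either economist’s formal model), a Keynesian consumer spends a fixed fraction of this year’s income, while a permanent-income consumer spends a fraction of average lifetime income, so a one-year windfall barely moves consumption.

```python
# Stylized contrast between the two consumption theories (illustrative numbers).
mpc = 0.8                       # marginal propensity to consume
income = [50.0] * 10            # a 10-year horizon of steady income
income[0] += 10.0               # a one-time windfall of 10 in year one

# Keynesian rule: consume a fraction of *current* income.
keynesian_c_year1 = mpc * income[0]

# Permanent-income rule: consume a fraction of *average lifetime* income.
permanent_income = sum(income) / len(income)
friedman_c_year1 = mpc * permanent_income

print(keynesian_c_year1)   # 48.0 -- the windfall lifts spending by 8
print(friedman_c_year1)    # 40.8 -- the windfall lifts spending by only 0.8
```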

Together, these arguments shifted the field back toward focusing on how the economy pushed toward optimal outcomes. What we think of in the policy community as “supply-side” policy was the focus on encouraging actors to engage in the economy. In contrast, demand-side management sought to understand business cycles and was important for recessions, which the Federal Reserve could fix using interest rate policy. In other words, we were back to assuming that the economy would revert to full employment and to what economists call “optimal” outcomes, if only the government would get out of the way.

As the United States neared the end of the twentieth century, there were many indications that this was the right economic framework. The United States led the world in bringing new innovations to market and, up until the late 1970s, America’s middle class saw strong gains year to year. We had avoided another crisis like the Great Depression and our economy drew in immigrants from across the globe. If policymakers focused on improving productivity, the market would take care of the rest—or so economists thought.

The Unraveling

By the end of the twentieth century, a cadre of economists had grown up within this new paradigm. In her recent book, Leftism Reinvented, Stephanie Mudge points to the rise of what she terms the “transnational finance-oriented economists” who “specialized in how to keep markets happy and reasoned that market-driven growth was good for everyone.” But behind the scenes, there was a new set of ideas brewing. For the market fundamentalist argument to be true, the market needs to work as advertised. Yet new data and methods eventually led to profound questions about this conclusion.

My introduction to this work came in the first week of my first graduate Labor Economics course in 1993. The professor focused on a set of then newly released research papers by David Card and Alan Krueger in which they used “natural experiments” (here, comparing employment and earnings in fast food restaurants across state lines before and after one state raised its minimum wage) to examine what happened when policymakers raised the minimum wage. This was an innovation. Prior to the 1990s, most research in economics focused on the model, not the empirical methods. Indeed, Card recently told the Washington Center for Equitable Growth’s Ben Zipperer, “In the 1970s, if you were taking a class in labor economics, you would spend a huge amount of time going through the modeling section and the econometric method. And ordinarily, you wouldn’t even talk about the tables. No one would even really think of that as the important part of the paper. The important part of the paper was laying out exactly what the method was.”

This was not only an interesting and relevant policy question; it also was a deeply theoretical one. Standard economic theory predicts that when a price rises, demand falls. Therefore, when policymakers increase the minimum wage, employers should respond by hiring fewer workers. But Card and Krueger’s analysis came to a different conclusion. Using a variety of data and methods—some relatively novel—they found that when policymakers in New Jersey raised the minimum wage, employment in fast food restaurants did not decline relative to those in the neighboring state of Pennsylvania. Their research had real-world implications and broke new ground in research methods; it also brought to the fore profoundly unsettling theoretical questions.

When Card and Krueger published their analysis, “natural experiments” were a new idea in economics research—and Card gives credit to Harvard economist Richard Freeman as “the first person” he heard use the phrase. These techniques, alongside other methods, allowed economists to estimate causality—that is, to show that one thing caused another, rather than simply being able to say that two things seem to go together or move in tandem. As Diane Whitmore Schanzenbach at Northwestern University told me, “In the last 15 or 20 years or so, economics—empirical economics—has really undergone what we call the credibility revolution. In the old days, you could get away with doing correlational research that doesn’t have a format that allows you to extract the cause and effect between a policy and an outcome.”
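The arithmetic behind the Card and Krueger comparison, now taught as “difference-in-differences,” fits in a few lines. Here is a minimal sketch with invented numbers rather than their actual survey data: subtract each state’s pre-period employment from its post-period employment, then subtract the control state’s change from the treated state’s change.

```python
# Difference-in-differences sketch with made-up numbers (not Card & Krueger's data).
# The treated state raises its minimum wage between the two survey waves;
# the neighboring control state does not.
treated = {"before": 20.0, "after": 20.5}   # avg. employees per store, treated state
control = {"before": 23.0, "after": 21.0}   # avg. employees per store, control state

treated_change = treated["after"] - treated["before"]
control_change = control["after"] - control["before"]

# The control state's change stands in for what would have happened
# in the treated state without the policy.
did_estimate = treated_change - control_change
print(did_estimate)  # > 0 here: no sign of the predicted employment decline
```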

The profession was not immediately comfortable with Card and Krueger’s research, balking at being told the world didn’t work the way theory predicted. Their 1995 book, Myth and Measurement, contained a set of studies that laid bare a conundrum at the core of economic theory. The reception was cold at best. At an Equitable Growth event marking the book’s twentieth-anniversary edition, Krueger recalled a prominent economist in the audience at an event for the first edition saying, “Theory is evidence too.” Indeed, when Krueger passed away in March, his obituary in The Washington Post quoted University of Chicago economist James J. Heckman, who in 1999 told The New York Times, “They don’t root their findings in existing theory and research. That is where I really deviate from these people. Each day you can wake up with a brand new world, not plugged into the previous literature.”

History tells a different story. Card and Krueger would go on to become greats in their field and to lead a new generation of scholars to new conclusions using innovative empirical techniques. Krueger’s passing earlier this year illustrated the extent of this transformation in economics. The discipline is now grounded in empirical evidence that focuses on proving causality, even when the findings do not conform to long-standing theoretical assumptions about how the economy works. New methods, such as natural or quasi-experiments that examine how people react to policies across time or place, are now the industry standard.

While in hindsight it might seem incredible that this discipline could have existed without these empirical techniques, their widespread adoption only came late in the twentieth century, alongside the dawn of the Information Age and advances in empirical methods, access to data, and computing power. As one journalist put it, “[N]umber crunching on PCs made noodling on chalkboards passé.” One piece of evidence for this is that the top economics journals now mostly publish empirical papers. Whereas in the 1960s about half of the papers in the top three economics journals were theoretical, about four in five now rely on empirical analysis—and of those, over a third use the researcher’s own data set, while nearly one in ten are based on an experiment. Card and Krueger connect their findings to another game-changing development—the emergence of a set of ideas known as behavioral economics. This body of thought—along with much of feminist economics—starts from the premise that there is no such thing as the “rational economic man,” the theoretical invention required for economic theory to work. Krueger put it this way:

The standard textbook model, by contrast, views workers as atomistic. They just look at their own situation, their own self-interest, so whether someone else gets paid more or less than them doesn’t matter. The real world actually has to take into account these social comparisons and social considerations. And the field of behavioral economics recognizes this feature of human behavior and tries to model it. That thrust was going on, kind of parallel to our work, I’d say.

This is the definition of a paradigm shift. As a result of these changes, empirical research is now the route to join those in the top echelons of economics. While this may seem like a field looking within, it also appears to be a field on the cusp of change. Kuhn talks about how as a field matures, it becomes more specialized; as researchers dig into specific aspects of theory, they often then uncover a new paradigm buried in fresh examinations of the evidence. This kind of research commonly elevates anomalies—findings that don’t fit the prevailing theory.

How we make sense of this new empirical analysis will define the new paradigm. The policy world has been quick to take note of key pieces of this new body of empirical research. Indeed, evidence-backed policymaking has become the standard in many areas. But the nature of a paradigm shift means that policymakers are in need of a new framework to make sense of all the pieces of evidence and to guide their agenda. We can see this in current policy debates; while conservatives continue to tout tax cuts as the solution to all that ails us, the public no longer believes this to be the true remedy. Whether that means the agenda being discussed in many quarters to address inequality, by taxing capital and addressing rising economic concentration, will become core to the new framework remains to be seen.

A New Vision

A new focus on empirical analysis doesn’t necessarily mean a new paradigm. The evidence must be integrated into a new story of what economics is and seeks to understand. We can see something like this happening as scholars seek to understand inequality—what economists often refer to in empirical work as “heterogeneity.” Putting inequality at the core of the analysis pushes forward questions about whether the market performs the same for everyone—rich and poor, with economic power or without—and what that means for how the economy functions. It brings to the fore questions that cannot be ignored about how economic power translates into social power. Most famously, in Capital in the Twenty-First Century, Thomas Piketty brings together hundreds of years of income data from across a number of countries, and concludes from this that powerful forces push toward an extremely high level of inequality, so much so that capital will calcify as “the past tends to devour the future.”

That rethinking is happening right now. At January’s Allied Social Science Association conference—the gathering place for economists across all the subfields—UC Berkeley’s Gabriel Zucman put up a very provocative slide, which said only “Good-bye representative agent.” The slide was as revolutionary as Card and Krueger’s work decades before because it implied that we should let go of the workhorse macroeconomic models. For the most part, policymakers rely on so-called “representative agent” models to inform economic policy. These models assume that the responses of economic actors, be they firms or individuals, can be represented by one (or maybe two) sets of rules. That is, if conditions change—for example, a price rises—the model assumes that everyone in the economy responds in the same way, regardless of whether they are low income or high income. Moody’s Analytics maintains a commonly cited model of this kind, and its chief economist, Mark Zandi, confirms the point: “Most macroeconomists, at least those focused on the economy’s prospects, have all but ignored inequality in their thinking. Their implicit, if not explicit, assumption is that inequality doesn’t matter much when gauging the macroeconomic outlook.”
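A toy calculation shows why that assumption can mislead. The following sketch is our illustration, not the Moody’s Analytics model: when households differ in how much of an extra dollar they spend, the effect of a transfer depends on who receives it, which a single representative agent cannot register.

```python
# Toy contrast between representative-agent and heterogeneous-agent aggregates
# (illustrative numbers; this is not the Moody's Analytics model).
households = [
    {"share_of_transfer": 0.5, "mpc": 0.9},   # low-income: spend most of a windfall
    {"share_of_transfer": 0.5, "mpc": 0.2},   # high-income: save most of it
]
transfer = 100.0

# Heterogeneous aggregate: weight each group's spending by what it receives.
hetero_spending = sum(h["share_of_transfer"] * transfer * h["mpc"] for h in households)

# Representative agent: one average MPC applied to the whole transfer.
avg_mpc = sum(h["mpc"] for h in households) / len(households)
rep_spending = transfer * avg_mpc

print(hetero_spending, rep_spending)  # identical here (55.0 vs 55.0)...

# ...but target the same transfer entirely at low-income households:
print(1.0 * transfer * 0.9)  # 90.0 -- the representative agent still predicts 55.0
```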

But these workhorse models of macroeconomics underperformed—to put it mildly—in the run-up to the most recent financial crisis. They neither predicted the crisis nor provided reasonable estimates of economic growth as the Great Recession hit and the slow recovery began. When Zandi integrated inequality into the Moody’s macroeconomic forecasting model for the United States, he found that adding inequality to the traditional models—ones that do not take economic inequality into account at all—did not change the short-term forecasts very much. But when he looked at the long-term picture or considered the potential for the system to spin out of control, he found that higher inequality increases the likelihood of instability in the financial system.

This research is confirmed in a growing body of empirical work. This spring, International Monetary Fund economists Jonathan D. Ostry, Prakash Loungani, and Andrew Berg released a book pulling together a series of research studies showing the link between higher inequality levels and more frequent economic downturns. They find that when growth takes place in societies with high inequality, the economic gains are more likely to be destroyed by the more severe recessions and depressions that follow—and the economic pain is all too often compounded for those at the lower end of the income spectrum. Even so, as of now, most of the macroeconomic models used by central banks and financial firms for forecasting and decision-making don’t take inequality into account.

Key to any paradigm change is a new way of seeing the field of inquiry. That’s where Saez and his many co-authors, including Thomas Piketty and Gabriel Zucman, are making their mark. They have created what they call “Distributional National Accounts,” which combine the aggregate data on national income so important to the early twentieth-century paradigm with detailed data on how that income is allocated across individuals—incorporating the later-twentieth-century learning—into a measure that shows growth and its distribution. As Zucman told me, “We talk about growth, we talk about inequality; but never about the two together. And so distribution of national accounts, that’s an attempt at bridging the gap between macroeconomics on the one hand and inequality studies on the other hand.”
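Mechanically, the idea is simple to sketch. The numbers below are hypothetical, not the actual Piketty-Saez-Zucman series: one aggregate growth rate is decomposed into growth rates by income group, so growth and its distribution appear in the same table.

```python
# Sketch of a distributional growth table (hypothetical numbers, not the
# actual Distributional National Accounts data).
national_income = {"2018": 17_000.0, "2019": 17_510.0}   # $billions
shares = {   # share of national income accruing to each group, by year
    "bottom 50%": {"2018": 0.13, "2019": 0.125},
    "middle 40%": {"2018": 0.41, "2019": 0.405},
    "top 10%":    {"2018": 0.46, "2019": 0.470},
}

total_growth = national_income["2019"] / national_income["2018"] - 1
print(f"aggregate growth: {total_growth:.1%}")           # 3.0%

for group, s in shares.items():
    income_2018 = s["2018"] * national_income["2018"]
    income_2019 = s["2019"] * national_income["2019"]
    growth = income_2019 / income_2018 - 1
    print(f"{group}: {growth:+.1%}")   # the same 3.0% headline splits unevenly
```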

Congress seems to have gotten the message. The 2019 Consolidated Appropriations Act, which opened the government after a record 35-day shutdown, included language “encourag[ing]” the Bureau of Economic Analysis to “begin reporting on income growth indicators” by decile at least annually and to incorporate that work into its GDP releases if possible. In this way, step by step, new economic paradigms become new policymaking tools.


Heather Boushey is President and CEO of the Washington Center for Equitable Growth and author of Unbound: How Inequality Constricts Our Economy and What We Can Do About It.


"A multidisciplinary forum focused on the social consequences and policy implications of all forms of knowledge on a global basis"

We are pleased to introduce Eruditio, the electronic journal of the World Academy of Art & Science. The vision of the Journal complements and enhances the World Academy's focus on global perspectives in the generation of knowledge from all fields of legitimate inquiry. The Journal also mirrors the World Academy's specific focus and mandate which is to consider the social consequences and policy implications of knowledge in the broadest sense.

Eruditio Issues

Volume 2 issue 1.

  • Volume 2 Issue 2
  • Volume 2 Issue 3
  • Volume 2 Issue 4
  • Volume 2 Issue 5
  • Volume 2 Issue 6
  • Volume 3 Issue 1

Editorial Information

  • Editorial Board
  • Author Guidelines
  • Editorial Policy
  • Reference Style Guide

World Academy of Art & Science

  • Publications
  • Events & News

Introduction to the New Paradigm of Political Economic Theory

ARTICLE | November 27, 2015 | BY Winston P. Nagan

This short article attempts to provide a reasonably simple introduction to a complex initiative. Influential Fellows of the World Academy of Art and Science, moved in part by the global crisis of unemployment and a conspicuous lack of theoretical engagement that might constructively respond to the problem, concluded that this intellectual silence reflected a dire need for new thinking about the importance of political economy and its salience for a defensible world order. Leading figures in the Academy, such as Orio Giarini, Ivo Šlaus, Garry Jacobs, Ian Johnson, and many others, have diligently worked on a new economic framework focused on the centrality of human capital as a critical foundation for economic prosperity.

This article seeks to contribute a clearer and more simplified description of the fundamental paradigms of the traditional and emerging economic orders. It sets out the paradigmatic contours of classical theory, moves from classical theory (the old normal) to the new normal of neoliberalism, and then recommends a framework for the future that borrows from the new paradigm thinking of jurisprudence. It applies and summarizes these ideas as guidelines for developing a theory of political economy as an inquiring system with a comprehensive focus and a fixed concentration on a human-centered approach for the future. This approach summarizes articles the author wrote for Cadmus.*

1. The Background to Basic Theory and its Roots in laissez-faire

Economic theory is a disputed field of intellectual endeavor. The stakes implicated in theory development are high, and as a consequence theory is a contested domain. The contestation is intensified because the dominance of a particular theory will influence that theory’s social impact on human relations, which in turn will invite policy interventions and policy consequences. Within this arena of theoretical contestation, there has emerged a new normal for economic theory. We may regard this new normal as the conventional paradigm of economic theory. It has come with various terms of identification, but the one that seems to be ascendant is encapsulated in the phrase “economic neoliberalism.” In a sense, economic neoliberalism draws powerful inspiration from the earliest iterations of the nature of economic activity. In the 18th century, French officials adopted and popularized a phrase that would serve both as an empirical description of ideal economic exchanges and as a preferred model for the structure and function of the arena within which economic activity happened. The phrase was laissez-faire. In practice, this meant that the state should reduce its regulatory control over economic interactions within the body politic. The less regulation, the less interference there would be in the arena of economic activity. Less interference meant increased dynamism in the arena of economic productivity, distribution, and exchange.

In the latter part of the 18th century, the moral philosopher Adam Smith published his famous book, The Wealth of Nations. Smith was aware of the principle of laissez-faire that had emerged in French practice. Indeed, he found the idea compatible with the theory of economic enterprise developed in his book. Smith provided both a description of how economies worked and, by implication, a justification for the importance of his model in improving the level of economic prosperity in society. Adam Smith was preeminently a moral philosopher with profound economic insight. In his book, The Theory of Moral Sentiments, he noted that the specialized capacities of human beings were not coordinated by centralized authority and control. On the contrary, they were coordinated by something more impersonal—the market. Through the pursuit of economic self-interest and the system of pricing, human beings and their capacities are led to meet the needs of others whom they do not know, by a mechanism they do not comprehend.

The genius of Smith’s work lay in its simplicity, using common-sense ideas to sustain an understanding of the workings of the economic order. Economic relations encompassed the supplier of goods and services and the demander of goods and services. The goods and services constituted property that was exchanged between supplier (S) and demander (D). The arena of this exchange between S and D was the market, which established a natural equilibrium when it functioned optimally and satisfied the self-interests of both S and D. We may look at Smith’s model as the old normal of economic theorizing. The importance of this model is its reinvention in the new normal model of economic neoliberalism.
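In modern textbook form, that supplier-and-demander story reduces to finding the price at which quantity supplied equals quantity demanded. A minimal sketch with illustrative linear curves follows; Smith, of course, used no algebra.

```python
# Textbook market-clearing sketch of the supplier/demander exchange
# (illustrative linear curves, chosen only for the example).
def demand(price: float) -> float:
    return 100 - 2 * price      # demanders buy less as the price rises

def supply(price: float) -> float:
    return 10 + 4 * price       # suppliers offer more as the price rises

# Equilibrium: 100 - 2p = 10 + 4p  =>  p = 15, quantity = 70.
p = (100 - 10) / (2 + 4)
print(p, demand(p), supply(p))  # 15.0 70.0 70.0
```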

2. The Influence of Value-Free Positivism

We must carefully remind ourselves that Adam Smith was at heart a moral philosopher. This market-centered understanding nevertheless became the central legacy of his work, largely because subsequent economists, committed to the positivist approach to the study of economics, felt no impulse to moderate the dynamics of the autonomous market with the untidy implications of collective and individual social responsibility. Vitally important to this approach was its strenuous justification for insulating economics from social reality and social responsibility. Indeed, positivism as a science went much further: it excluded normative discourse and its value implications because values were essentially non-science. At the time, there did not yet exist a credible science of society either. Adam Smith’s own theoretical meditations did not subscribe to this, as modern scholarship has amply demonstrated.† Braham has isolated four precepts in the corpus of Smith’s writings which clarify this issue. First, there is the assumption that when people are left alone to pursue their own interests, an invisible hand rides along with this dynamic, so that society as a whole benefits from their conduct. This idea is moderated by Smith’s moral egalitarianism, which implies that every person has equal moral worth. This brings us to Smith’s ideas of social justice, which are connected to moral egalitarianism. Here Smith was deeply influenced by the jurisprudence of natural law. Natural law distinguishes between commutative justice and distributive justice. In the former, justice is done according to the right one has to compensation for a legal wrong done. The latter is more complex. Smith’s work is permeated with discussions of the foundations of distributive justice.

Following this classical tradition, distributive justice is equated with beneficence, the application of ‘charity and generosity’ based on an individual or social assessment of ‘merit.’ Under this notion, the rules that assign particular objects to particular persons, which is the nub of the concept of distributive justice, are a private matter rather than a public one or a matter of social norms; it is not a duty of society at large, and no one has a claim in morality against others to alleviate their condition. Smith subsumes this notion of justice under ‘all the social virtues’.‡

Modern science then added an important dimension to the evolution of the old model. In the early 19th century, the social sciences and law came under the influence of positivism. The positivist impulse was meant to bring intellectual rigor, a strict commitment to objectivity, and an insistence that scientific inquiry be completely separated from inquiry into values, morals, and ethics. The influence of science and mathematics on economics has been enormous. Credible scientific work in economics came to require a reliance on mathematics and mathematical abstractions. This tended to remove theory from the critical scrutiny of intellectuals untrained in mathematics.

As complex mathematical equations were infused into economic theory, the trend led to a greater formalization of the discipline, and as a consequence this formalistic emphasis was further abstracted from the concrete conditions of social life and human problems. Moreover, the principle that the market established an abstract equilibrium of absolute efficiency seemed to be conventional wisdom in policy-making circles. This approach to economic organization received a severe setback between 1929 and 1933. The conventional wisdom at the time was that the laissez-faire approach of a weakly regulated economy had caused the Great Depression, and that there was no natural force within the market to pull the economy back out of it. In later years a single American economist, Milton Friedman, claimed that the Depression was not a failure of the free natural market but a failure of government policy: the government did not sufficiently monetize the economy, and within three years the amount of money in the economy was reduced by a third. This, he claimed, was the cause of the Depression, not fidelity to a weakly regulated market.

3. The New Normal in Economic Theory: Economic Neoliberalism − Milton Friedman and the University of Chicago’s Economics Department

Milton Friedman is generally acknowledged to be the architect of the New Normal Paradigm of economic thinking. He was a leader of the University of Chicago’s Economics Department, which was the institutional base for the New Normal Paradigm. The two significant influences that emerged after the Second World War were the Keynesian-influenced American New Deal and the reach of Stalin’s influence in Eastern Europe. From the perspective of Friedman and his colleagues, the New Deal was a form of creeping socialism and an indirect threat to freedom. As for Stalin’s socialism, the state’s extinction of private property and control of the economy were quite simply an extinction of freedom. In 1947, Friedrich von Hayek, Milton Friedman, and others formed the Mont Pelerin Society to address these questions intellectually.

The fundamentals of economic neoliberalism insist upon a radical privatization of property and value in society: if a matter may be privatized, it should be privatized. Additionally, economic neoliberalism favored the notion of the minimal state: the more deregulation and limitation on the state’s power to regulate, the better. It held a strong belief in corporate tax cuts and reduced taxes for the wealthy, and in trade liberalization and open markets. Finally, with regard to the minimal state, it envisioned a massive diminution of the role of government in society. The writer Tayyab Mahmud describes economic neoliberalism as follows:

The neoliberal project is to turn the “nation-state” into one with the primary agenda of facilitating global capital accumulation, unburdened by any legal regulations aimed at assuring the welfare of citizens. In summary, neoliberalism seeks unbridled accumulation of capital through a rollback of the state, limiting its functions to minimal security and the maintenance of law, fiscal and monetary discipline, flexible labor markets, and the liberalization of trade and capital flows.

Friedman made several strong arguments as to why governmental intervention in the market is generally futile, or leaves the economy worse off than it would have been without the intervention. These arguments were built around the ideas of adaptive expectations and rational expectations. With regard to adaptive expectations, Friedman argued that government money-printing raised inflation, and that businessmen neutralized increases in the money supply by predicting them. The rational expectations argument was based on the idea that the market would predict, and thereby undermine, government intervention. These ideas were meant to show that markets are indeed self-regulating and that regulation is both unnecessary and dysfunctional.
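That adaptive-expectations mechanism is easy to sketch numerically. The simulation below is a stylized illustration with invented numbers, not Friedman’s formal model: agents revise their inflation forecast by a fraction of their most recent error, so a permanent acceleration of money growth is eventually anticipated in full and loses its real effect.

```python
# Stylized adaptive expectations (invented numbers, not Friedman's model):
# the expected inflation rate adjusts toward actual inflation by a
# fixed fraction of the most recent forecast error.
expected = 0.02            # agents start out expecting the old 2% inflation
actual = 0.06              # money growth permanently pushes inflation to 6%
adjustment_speed = 0.4     # fraction of the error corrected each year

for year in range(1, 9):
    surprise = actual - expected       # only unanticipated inflation has real effects
    expected += adjustment_speed * surprise
    print(year, round(expected, 4), round(actual - expected, 4))
# The surprise term shrinks toward zero: once the faster money growth is
# fully anticipated, the stimulus is neutralized.
```

There is a vast range of critiques of economic neoliberalism, but the critique of N. Chomsky seems to be one of the most compelling: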

Neoliberalism is actually closer to corporatism than any other philosophy in that, in its abandonment of the traditional regulatory function of the state and embracing of corporate goals and objectives, it cedes sovereignty over how its economy and society are organized to a global cabal of corporate elites.

Since the economic crisis of 2008, criticisms of economic neoliberalism have also focused on the deregulation of the global financial system, which critics charge is organized along the lines of a gambler’s nirvana. Moreover, this is an economic model that could not predict the financial catastrophe that accompanied the crisis. The consequences of the theory and its practice have also included a global crisis of radical inequality. The theory likewise lacks a credible account of sustainable development: it resists concern about the economy’s impact on environmental degradation and climate change. Finally, the radical exclusion of values from economic theory undermines the assignment of responsibility to the private sector for mismanagement and dangerous conduct.

The central thrust of our emphasis is to deemphasize the abstract, pseudoscientific formalism of economic neoliberalism and to develop a comprehensive theory for inquiry into economic phenomena, from the local to the most comprehensive Earth-Space context. We recognize that putting theory into the most comprehensive context generates complexity and a critical need for expeditious knowledge integration. In short, economics should be enriched and informed by sociology, anthropology, political science, and the psychological sciences, as well as by lessons from the enhanced methods of the physical sciences. Our theory and method of inquiry therefore set out, as their initial task, the development of a theory that describes economics as it is in the broadest eco-social context.

4. The Fundamentals of a New Paradigm of Political Economy

The search for a new paradigm of political economy is in effect the search for a theory about political economy that should be comprehensive enough to embrace the context of the entire earth-space community. It must also be particular in adequately accounting for the specific localized effects of economic theory, policy, and practice. To this end, a new paradigm theory of political economy should include the following emphases:

  • It must have a comprehensive global eco-social focus for relevant inquiry. This means theory must not only transcend but also include the relevance of the sovereign state, while stressing the importance of transnational causes and consequences of economically related behavior. In particular, it must acknowledge the salience of the global inter-determination of economic perspectives and operations.
  • It must engage in normative, value-based description and analysis, including a clarification of the basic goal values of the current world order. It must use these as markers to clarify the basic community policies implicated in all economic cooperation and contestation. Here the all-inclusive value of universal human dignity may be a critical principle of political-economic normative guidance.
  • Political economy is not animated by an autonomous machine. It is given dynamism by sustained advocacy and, very critically, by both authoritative and controlling decision-making. The critical role of decision is a mandated focus of professional responsibility as well as responsible inquiry.
  • Just as political economy must account for the structure of authority and control in the sovereign state, it must be alert to the principal features of the global constitutional order. In particular, it must be alert to the way in which the global constitutional order and its decision processes shape the evolving domains of world order.
  • The evolving new paradigm of political economy must engage in the scientific task of illuminating and evaluating the conditions that inspire political-economic outcomes. In short, it is a task that requires the identification and analysis of the politically and economically relevant causes and consequences that influence economic outcomes.
  • The evolving new paradigm theory of political economy must consciously seek to anticipate and examine all possibly relevant future scenarios. To enhance the rationality of this function of theory, it may well be guided by a clarification of the values desired in those future scenarios.
  • The new paradigm of political economy must infuse itself with the most important element of the human faculty—human creativity. In particular, this means the new paradigm must focus on the alternative possibilities that may be anticipated from relevant future scenarios. This focus should carry a creative element that opens the prospect of imaginative but realizable future outcomes compatible with the fundamental values that represent the common interest of the community as a whole.

* See http://cadmusjournal.org/author/winston-p-nagan-0

† See Matthew Braham, Adam Smith’s Concept of Social Justice, August 14, 2006.

‡ Id. at 1.


Oxford Research Encyclopedias: Economic Theory and Mathematical Models
Anthropometrics: The Intersection of Economics and Human Biology  

John Komlos

Anthropometrics is a research program that explores the extent to which economic processes affect human biological processes, using height and weight as markers. This agenda differs from health economics in the sense that instead of studying diseases or longevity, macro manifestations of well-being, it focuses on cellular-level processes that determine the extent to which the organism thrives in its socio-economic and epidemiological environment. Thus, anthropometric indicators are used as a proxy measure for the biological standard of living, as complements to conventional measures based on monetary units. Using physical stature as a marker, we enabled the profession to learn about the well-being of children and youth, for whom market-generated monetary data are not abundant even in contemporary societies. It is now clear that economic transformations such as the onset of the Industrial Revolution and modern economic growth were accompanied by negative externalities that were hitherto unknown. Moreover, there is plenty of evidence to indicate that the welfare states of Western and Northern Europe take better care of the biological needs of their citizens than the market-oriented health-care system of the United States. Obesity has reached pandemic proportions in the United States, affecting 40% of the population. It is fostered by a sedentary and harried lifestyle, by the diminution in self-control, the spread of labor-saving technologies, and the rise of instant gratification characteristic of post-industrial society. The spread of television and of a fast-food culture in the 1950s were watershed developments in this regard that accelerated the process. Obesity poses serious health risks, including heart disease, stroke, diabetes, and some types of cancer, and its cost reaches $150 billion per annum in the United States, or about $1,400 per capita. We conclude that the economy influences not only mortality and health but reaches bone-deep into the cellular level of the human organism. In other words, the economy is inextricably intertwined with human biological processes.

A Survey of Econometric Approaches to Convergence Tests of Emissions and Measures of Environmental Quality  

Junsoo Lee, James E. Payne, and Md. Towhidul Islam

The analysis of convergence behavior with respect to emissions and measures of environmental quality can be categorized into four types of tests: absolute and conditional β-convergence, σ-convergence, club convergence, and stochastic convergence. In the context of emissions, absolute β-convergence occurs when countries with high initial levels of emissions have a lower emission growth rate than countries with low initial levels of emissions. Conditional β-convergence allows for possible differences among countries through the inclusion of exogenous variables to capture country-specific effects. Given that absolute and conditional β-convergence do not account for the dynamics of the growth process, which can potentially lead to dynamic panel data bias, σ-convergence evaluates the dynamics and intradistributional aspects of emissions to determine whether the cross-section variance of emissions decreases over time. The more recent club convergence approach tests the decline in the cross-sectional variation in emissions among countries over time and whether heterogeneous time-varying idiosyncratic components converge over time after controlling for a common growth component in emissions among countries. In essence, the club convergence approach evaluates both conditional σ- and β-convergence within a panel framework. Finally, stochastic convergence examines the time series behavior of a country’s emissions relative to another country or group of countries. Using univariate or panel unit root/stationarity tests, stochastic convergence is present if relative emissions, defined as the log of emissions for a particular country relative to another country or group of countries, is trend-stationary. The majority of the empirical literature analyzes carbon dioxide emissions and varies in terms of both the convergence tests deployed and the results. While the results supportive of emissions convergence for large global country coverage are limited, empirical studies that focus on country groupings defined by income classification, geographic region, or institutional structure (i.e., EU, OECD, etc.) are more likely to provide support for emissions convergence. The vast majority of studies have relied on tests of stochastic convergence, with tests of σ-convergence and the distributional dynamics of emissions less so. With respect to tests of stochastic convergence, an alternative testing procedure that accounts for structural breaks and cross-correlations simultaneously is presented. Using data for OECD countries, the results based on the inclusion of both structural breaks and cross-correlations through a factor structure provide less support for stochastic convergence when compared to unit root tests with the inclusion of just structural breaks. Future studies should increase their focus on other air pollutants, including greenhouse gas emissions and their components, expand the range of geographical regions analyzed, and undertake more robust analysis of the various types of convergence tests to render a more comprehensive view of convergence behavior. The examination of convergence through the use of eco-efficiency indicators that capture both the environmental and economic effects of production may be more fruitful in contributing to the debate on mitigation strategies and allocation mechanisms.
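As a concrete illustration of the simplest of these concepts, the sketch below simulates σ-convergence on fabricated data (it is not drawn from the studies surveyed): convergence appears as a cross-country standard deviation of log emissions that shrinks over time.

```python
# Sigma-convergence sketch on simulated per-capita emissions
# (fabricated data for illustration; not from the surveyed studies).
import math
import random

random.seed(0)
countries = 40
years = 30
# Simulate log emissions that drift toward a common level (convergence built in).
log_e = [random.gauss(2.0, 0.6) for _ in range(countries)]

def cross_section_sd(values):
    m = sum(values) / len(values)
    return math.sqrt(sum((v - m) ** 2 for v in values) / (len(values) - 1))

for t in range(years):
    if t % 10 == 0:
        print(t, round(cross_section_sd(log_e), 3))
    mean = sum(log_e) / len(log_e)
    # Each country closes 5% of its gap to the cross-country mean, plus a shock.
    log_e = [x + 0.05 * (mean - x) + random.gauss(0, 0.02) for x in log_e]
# Sigma-convergence: the cross-sectional standard deviation falls over time.
```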

Bayesian Statistical Economic Evaluation Methods for Health Technology Assessment  

Andrea Gabrio, Gianluca Baio, and Andrea Manca

The evidence produced by healthcare economic evaluation studies is a key component of any Health Technology Assessment (HTA) process designed to inform resource allocation decisions in a budget-limited context. To improve the quality (and harmonize the generation process) of such evidence, many HTA agencies have established methodological guidelines describing the normative framework inspiring their decision-making process. The information requirements that economic evaluation analyses for HTA must satisfy typically involve the use of complex quantitative syntheses of multiple available datasets, handling mixtures of aggregate and patient-level information, and the use of sophisticated statistical models for the analysis of non-Normal data (e.g., time-to-event, quality of life and costs). Much of the recent methodological research in economic evaluation for healthcare has developed in response to these needs, in terms of sound statistical decision-theoretic foundations, and is increasingly being formulated within a Bayesian paradigm. The rationale for this preference lies in the fact that by taking a probabilistic approach, based on decision rules and available information, a Bayesian economic evaluation study can explicitly account for relevant sources of uncertainty in the decision process and produce information to identify an “optimal” course of actions. Moreover, the Bayesian approach naturally allows the incorporation of an element of judgment or evidence from different sources (e.g., expert opinion or multiple studies) into the analysis. This is particularly important when, as often occurs in economic evaluation for HTA, the evidence base is sparse and requires some inevitable mathematical modeling to bridge the gaps in the available data. The availability of free and open-source software in the last two decades has greatly reduced the computational costs, facilitated the application of Bayesian methods, and has the potential to improve the work of modelers and regulators alike, thus advancing the field of economic evaluation of healthcare interventions. This chapter provides an overview of the areas where Bayesian methods have contributed to addressing the methodological needs that stem from the normative framework adopted by a number of HTA agencies.
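To give a flavor of the decision-theoretic machinery described here, the sketch below uses made-up posterior parameters (not any HTA agency’s model) to compute the probability that a new treatment is cost-effective at a given willingness-to-pay threshold.

```python
# Sketch of a Bayesian cost-effectiveness calculation (made-up parameters;
# a real HTA analysis would estimate these posteriors from trial data).
import random

random.seed(1)
wtp = 30_000.0        # willingness to pay per unit of health effect (e.g., per QALY)
draws = 10_000

favourable = 0
for _ in range(draws):
    # Posterior draws for the incremental effect and incremental cost of the
    # new treatment versus standard care (normal approximations, assumed).
    d_effect = random.gauss(0.10, 0.05)     # extra QALYs
    d_cost = random.gauss(2_000.0, 800.0)   # extra cost
    # Incremental net monetary benefit: value of the effect minus the extra cost.
    if wtp * d_effect - d_cost > 0:
        favourable += 1

print(favourable / draws)  # probability the new treatment is cost-effective (~0.72)
```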

The Biological Foundations of Economic Preferences  

Nikolaus Robalino and Arthur Robson.

Modern economic theory rests on the basic assumption that agents’ choices are guided by preferences. The question of where such preferences might have come from has traditionally been ignored or viewed agnostically. The biological approach to economic behavior addresses the issue of the origins of economic preferences explicitly. This approach assumes that economic preferences are shaped by the forces of natural selection. For example, an important theoretical insight delivered thus far by this approach is that individuals ought to be more risk averse to aggregate than to idiosyncratic risk. Additionally, the approach has delivered an evolutionary basis for hedonic and adaptive utility and an evolutionary rationale for “theory of mind.” Related empirical work has studied the evolution of time preferences and loss aversion, and has explored the deep evolutionary determinants of long-run economic development.
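
A worked numerical example can make the aggregate-versus-idiosyncratic insight concrete. The sketch below, with invented payoffs, contrasts lineage growth when a reproductive gamble is independent across individuals with growth when the same gamble hits the whole population at once.

```python
# Illustration (invented numbers): a reproductive gamble paying 0.5
# or 1.6 offspring with equal probability. If the risk is
# idiosyncratic, the law of large numbers applies within each
# generation and the lineage grows at the arithmetic mean; if the
# risk is aggregate, the whole lineage draws the same outcome and
# long-run growth is the geometric mean.
import numpy as np

outcomes = np.array([0.5, 1.6])
probs = np.array([0.5, 0.5])

arith = np.sum(probs * outcomes)    # idiosyncratic growth factor
geom = np.prod(outcomes ** probs)   # aggregate growth factor
print(f"idiosyncratic: {arith:.3f} per generation")  # 1.050
print(f"aggregate:     {geom:.3f} per generation")   # ~0.894

# The same gamble is growth-enhancing when idiosyncratic but drives
# the lineage to extinction when aggregate -- the evolutionary basis
# for stronger aversion to aggregate risk.
```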

Consumer Debt and Default: A Macro Perspective  

Florian Exler and Michèle Tertilt.

Consumer debt is an important means for consumption smoothing. In the United States, 70% of households own a credit card, and 40% borrow on it. When borrowers cannot (or do not want to) repay their debts, they can declare bankruptcy, which provides additional insurance in tough times. Since the 2000s, up to 1.5% of households declared bankruptcy per year. Clearly, the option to default affects borrowing interest rates in equilibrium. Consequently, when assessing (welfare) consequences of different bankruptcy regimes or providing policy recommendations, structural models with equilibrium default and endogenous interest rates are needed. At the same time, many questions are quantitative in nature: the benefits of a certain bankruptcy regime critically depend on the nature and amount of risk that households bear. Hence, models for normative or positive analysis should quantitatively match some important data moments. Four important empirical patterns are identified: First, since 1950, consumer debt has risen constantly, and it amounted to 25% of disposable income by 2016. Defaults have risen since the 1980s. Interestingly, interest rates remained roughly constant over the same time period. Second, borrowing and default clearly depend on age: both measures exhibit a distinct hump, peaking around 50 years of age. Third, ownership of credit cards and borrowing clearly depend on income: high-income households are more likely to own a credit card and to use it for borrowing. However, this pattern was stronger in the 1980s than in the 2010s. Finally, interest rates became more dispersed over time: the number of observed interest rates more than quadrupled between 1983 and 2016. These data have clear implications for theory: First, considering the importance of age, life cycle models seem most appropriate when modeling consumer debt and default. Second, bankruptcy must be costly to support any debt in equilibrium. While many types of costs are theoretically possible, only partial repayment requirements are able to quantitatively match the data on filings, debt levels, and interest rates simultaneously. Third, to account for the long-run trends in debts, defaults, and interest rates, several quantitative theory models identify a credit expansion along the intensive and extensive margin as the most likely source. This expansion is a consequence of technological advancements. Many of the quantitative macroeconomic models in this literature assess welfare effects of proposed reforms or of granting bankruptcy at all. These welfare consequences critically hinge on the types of risk that households face—because households incur unforeseen expenditures, not-too-stringent bankruptcy laws are typically found to be welfare superior both to banning bankruptcy (or making it extremely costly) and to extremely lax bankruptcy rules. There are promising opportunities for future research related to consumer debt and default: newly available data in the United States and internationally, more powerful computational resources allowing for more complex modeling of household balance sheets, and new loan products are just some of the many avenues worth exploring.
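
The role of the default option in equilibrium pricing can be illustrated with a one-period, risk-neutral break-even calculation. The numbers below are loosely motivated by the moments cited above but are otherwise invented, and this is a drastic simplification of the structural models the chapter describes.

```python
# Sketch: how the default option feeds into equilibrium borrowing
# rates. A risk-neutral lender breaks even when expected repayment
# equals the risk-free return.
risk_free = 0.04   # lender's opportunity cost
p_default = 0.015  # bankruptcy probability (~1.5% of households)
recovery = 0.30    # fraction repaid in bankruptcy (partial repayment)

# Zero-profit condition:
# (1 - p) * (1 + r) + p * recovery * (1 + r) = 1 + risk_free
r = (1 + risk_free) / (1 - p_default * (1 - recovery)) - 1
print(f"break-even borrowing rate: {r:.3%}")  # ~5.1%
```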

Contests: Theory and Topics  

Qiang Fu and Zenan Wu.

Competitive situations resembling contests are ubiquitous in the modern economic landscape. In a contest, economic agents expend costly effort to vie for limited prizes, and they are rewarded for “getting ahead” of their opponents instead of for their absolute performance metrics. Many social, economic, and business phenomena exemplify such competitive schemes, ranging from college admissions, political campaigns, advertising, and organizational hierarchies, to warfare. The economics literature has long recognized the contest/tournament as a convenient and efficient incentive scheme to remedy the moral hazard problem, especially when the production process is subject to random perturbation or the measurement of input/output is imprecise or costly. An enormous amount of scholarly effort has been devoted to developing tractable theoretical models, unveiling the fundamentals of the strategic interactions that underlie such competitions, and exploring the optimal design of contest rules. This voluminous literature has enriched basic contest/tournament models by introducing different variations to the modeling, such as dynamic structure, incomplete and asymmetric information, multi-battle confrontations, sorting and entry, endogenous prize allocation, competition in groups, and contestants with alternative risk attitudes, among other things.
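
As one concrete instance of the tractable models mentioned above, the sketch below checks numerically that effort V/4 is the symmetric equilibrium of the canonical two-player Tullock contest with prize V and win probability x_i/(x_i + x_j); the grid search is purely illustrative.

```python
# Sketch: the two-player Tullock contest. Player i wins a prize V
# with probability x_i / (x_i + x_j) and pays effort cost x_i; the
# symmetric Nash equilibrium effort is V/4. Grid-based check below.
import numpy as np

V = 100.0
efforts = np.linspace(0.01, V, 100_000)

def best_response(x_other: float) -> float:
    payoff = V * efforts / (efforts + x_other) - efforts
    return efforts[np.argmax(payoff)]

x_star = V / 4  # theoretical equilibrium effort
print(f"best response to {x_star}: {best_response(x_star):.2f}")
# prints ~25.00, confirming x* = V/4 is a fixed point
```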

Econometrics for Modelling Climate Change  

Jennifer L. Castle and David F. Hendry.

Shared features of economic and climate time series imply that tools for empirically modeling nonstationary economic outcomes are also appropriate for studying many aspects of observational climate-change data. Greenhouse gas emissions, such as carbon dioxide, nitrous oxide, and methane, are a major cause of climate change as they accumulate in the atmosphere and reradiate the sun’s energy. As these emissions are currently mainly due to economic activity, economic and climate time series have commonalities, including considerable inertia, stochastic trends, and distributional shifts, and hence the same econometric modeling approaches can be applied to analyze both phenomena. Moreover, both disciplines lack complete knowledge of their respective data-generating processes (DGPs), so model search retaining viable theory but allowing for shifting distributions is important. Reliable modeling of both climate and economic-related time series requires finding an unknown DGP (or close approximation thereto) to represent multivariate evolving processes subject to abrupt shifts. Consequently, to ensure that the DGP is nested within a much larger set of candidate determinants, model formulations to search over should comprise all potentially relevant variables, their dynamics, indicators for perturbing outliers, shifts, trend breaks, and nonlinear functions, while retaining well-established theoretical insights. Econometric modeling of climate-change data requires a sufficiently general model selection approach to handle all these aspects. Machine learning with multipath block searches commencing from very general specifications, usually with more candidate explanatory variables than observations, to discover well-specified and undominated models of the nonstationary processes under analysis, offers a rigorous route to analyzing such complex data. To do so requires applying appropriate indicator saturation estimators (ISEs), a class that includes impulse indicators for outliers, step indicators for location shifts, multiplicative indicators for parameter changes, and trend indicators for trend breaks. All ISEs entail more candidate variables than observations, often by a large margin when implementing combinations, yet can detect the impacts of shifts and policy interventions to avoid nonconstant parameters in models, as well as improve forecasts. To characterize nonstationary observational data, one must handle all substantively relevant features jointly: A failure to do so leads to nonconstant and mis-specified models and hence incorrect theory evaluation and policy analyses.
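
A stylized example of indicator saturation may help. The sketch below implements a toy split-half version of impulse-indicator saturation on simulated data with one outlier; it is a simplification of the multipath block search described above, with invented data and a conventional tight significance level.

```python
# Minimal sketch of impulse-indicator saturation (IIS), split-half
# form: a dummy for every observation is added in two blocks, and
# indicators significant at a tight level are retained.
import numpy as np

rng = np.random.default_rng(2)
T = 100
x = rng.normal(size=T)
y = 1.0 + 0.5 * x + rng.normal(scale=0.5, size=T)
y[40] += 5.0  # a perturbing outlier

def retained_indicators(block: np.ndarray) -> list[int]:
    # Regress y on [const, x] plus an impulse dummy for each
    # observation in `block`; keep dummies with |t| > 2.56 (~1%).
    dummies = np.zeros((T, len(block)))
    dummies[block, range(len(block))] = 1.0
    X = np.column_stack([np.ones(T), x, dummies])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (T - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t_stats = beta[2:] / np.sqrt(np.diag(cov)[2:])
    return [int(block[i]) for i in range(len(block))
            if abs(t_stats[i]) > 2.56]

kept = sorted(retained_indicators(np.arange(0, 50))
              + retained_indicators(np.arange(50, 100)))
# Expect observation 40 to be flagged; the tight significance level
# keeps spurious retentions rare.
print("retained impulse indicators at observations:", kept)
```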

The Effects of Monetary Policy Announcements  

Chao Gu, Han Han, and Randall Wright.

The effects of news (i.e., information innovations) are studied in dynamic general equilibrium models where liquidity matters. As a leading example, news can be announcements about monetary policy directions. In three standard theoretical environments—an overlapping generations model of fiat currency, a new monetarist model accommodating multiple payment methods, and a model of unsecured credit—transition paths are constructed between an announcement and the date at which events are realized. Although the economics is different, in each case, news about monetary policy can induce volatility in financial and other markets, with transitions displaying booms, crashes, and cycles in prices, quantities, and welfare. This is not the same as volatility based on self-fulfilling prophecies (e.g., cyclic or sunspot equilibria) studied elsewhere. Instead, the focus is on the unique equilibrium that is stationary when parameters are constant but still delivers complicated dynamics in simple environments due to information and liquidity effects. This is true even for classically neutral policy changes. The induced volatility can be bad or good for welfare, but using policy to exploit this in practice seems difficult because outcomes are very sensitive to timing and parameters. The approach can be extended to include news of real factors, as seen in examples.

Fractional Integration and Cointegration  

Javier Hualde and Morten Ørregaard Nielsen.

Fractionally integrated and fractionally cointegrated time series are classes of models that generalize standard notions of integrated and cointegrated time series. The fractional models are characterized by a small number of memory parameters that control the degree of fractional integration and/or cointegration. In classical work, the memory parameters are assumed known and equal to 0, 1, or 2. In the fractional integration and fractional cointegration context, however, these parameters are real-valued and are typically assumed unknown and estimated. Thus, fractionally integrated and fractionally cointegrated time series can display very general types of stationary and nonstationary behavior, including long memory, and this more general framework entails important additional challenges compared to the traditional setting. Modeling, estimation, and testing in the context of fractional integration and fractional cointegration have been developed in both the time and frequency domains. In both approaches, theory has been derived under parametric or semiparametric assumptions, and, as expected, the obtained results illustrate the well-known trade-off between efficiency and robustness against misspecification. These different developments form a large and mature literature with applications in a wide variety of disciplines.
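
The fractional difference operator itself is easy to exhibit. The sketch below computes the binomial weights of (1 − L)^d and applies the truncated filter to a simulated series; it illustrates the definition, not any particular estimator from this literature.

```python
# Sketch: the fractional difference filter (1 - L)^d that defines
# fractional integration. Its binomial expansion gives weights
# pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k, so a real-valued d
# interpolates between the classical cases d = 0, 1, 2.
import numpy as np

def frac_diff_weights(d: float, n: int) -> np.ndarray:
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

print(frac_diff_weights(1.0, 4))  # [ 1. -1.  0.  0.] (ordinary difference)
print(frac_diff_weights(0.4, 4))  # slowly decaying weights: long memory

# Applying the (truncated) filter to a simulated random walk:
rng = np.random.default_rng(3)
y = rng.normal(size=200).cumsum()
w = frac_diff_weights(0.4, len(y))
dy = np.convolve(y, w)[: len(y)]  # (1 - L)^0.4 applied to y
```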

Frequency-Domain Approach in High-Dimensional Dynamic Factor Models  

Marco Lippi.

High-Dimensional Dynamic Factor Models have their origin in macroeconomics, precisely in empirical research on Business Cycles. The central idea, going back to the work of Burns and Mitchell in the 1940s, is that the fluctuations of all the macro and sectoral variables in the economy are driven by a “reference cycle,” that is, a one-dimensional latent cause of variation. After a fairly long process of generalization and formalization, the literature settled at the beginning of the 2000s on a model in which (1) both n, the number of variables in the dataset, and T, the number of observations for each variable, may be large, and (2) all the variables in the dataset depend dynamically on a fixed number, independent of n, of “common factors,” plus variable-specific, usually called “idiosyncratic,” components. The structure of the model can be exemplified as follows: x_it = α_i u_t + β_i u_{t−1} + ξ_it, i = 1, …, n, t = 1, …, T, (*) where the observable variables x_it are driven by the white noise u_t, which is common to all the variables (the common factor), and by the idiosyncratic component ξ_it. The common factor u_t is orthogonal to the idiosyncratic components ξ_it, and the idiosyncratic components are mutually orthogonal (or weakly correlated). Lastly, the variations of the common factor u_t affect the variable x_it dynamically, that is, through the lag polynomial α_i + β_i L. Asymptotic results for High-Dimensional Factor Models, particularly consistency of estimators of the common factors, are obtained for both n and T tending to infinity. Model (*), generalized to allow for more than one common factor and a rich dynamic loading of the factors, has been studied in a fairly vast literature, with many applications based on macroeconomic datasets: (a) forecasting of inflation, industrial production, and unemployment; (b) structural macroeconomic analysis; and (c) construction of indicators of the Business Cycle. This literature can be broadly classified as belonging to the time- or the frequency-domain approach. The works based on the second are the subject of the present chapter. We start with a brief description of early work on Dynamic Factor Models. Formal definitions and the main Representation Theorem follow. The latter determines the number of common factors in the model by means of the spectral density matrix of the vector (x_1t, x_2t, …, x_nt). Dynamic principal components, based on the spectral density of the x’s, are then used to construct estimators of the common factors. These results, obtained in the early 2000s, are compared to the literature based on the time-domain approach, in which the covariance matrix of the x’s and its (static) principal components are used instead of the spectral density and dynamic principal components. Dynamic principal components produce two-sided estimators, which are good within the sample but unfit for forecasting. The estimators based on the time-domain approach are simple and one-sided. However, they require the restriction of finite dimension for the space spanned by the factors. Recent papers have constructed one-sided estimators based on the frequency-domain method for the unrestricted model. These results exploit results on stochastic processes of dimension n that are driven by a q-dimensional white noise, with q < n.
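
The following sketch simulates model (*) and recovers the common factor space with static principal components, the time-domain benchmark against which the frequency-domain (dynamic principal components) results are compared; all parameter values are invented.

```python
# Sketch: simulate x_it = alpha_i u_t + beta_i u_{t-1} + xi_it and
# recover the common factor space by (static) PCA. With one dynamic
# factor loaded through a first-order lag polynomial, two static
# factors (u_t and u_{t-1}) span the common space.
import numpy as np

rng = np.random.default_rng(4)
n, T = 100, 500
u = rng.normal(size=T + 1)                 # common white-noise factor
alpha = rng.uniform(0.5, 1.5, size=(n, 1))
beta = rng.uniform(0.5, 1.5, size=(n, 1))
xi = rng.normal(size=(n, T))               # idiosyncratic components

x = alpha * u[1:] + beta * u[:-1] + xi     # shape (n, T)

# Static PCA on the sample covariance of x:
x_c = x - x.mean(axis=1, keepdims=True)
eigvals, eigvecs = np.linalg.eigh(x_c @ x_c.T / T)
factors = eigvecs[:, -2:].T @ x_c          # top two principal components

# Check: u_t should be well explained by the two estimated factors.
coef, *_ = np.linalg.lstsq(factors.T, u[1:], rcond=None)
corr = np.corrcoef(factors.T @ coef, u[1:])[0, 1]
print(f"correlation of projected factors with u_t: {corr:.3f}")  # near 1
```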

General Equilibrium Theory of Land  

Masahisa Fujita.

Land is everywhere: the substratum of our existence. In addition, land is intimately linked to the dual concept of location in human activity. Together, land and location are essential ingredients for the lives of individuals as well as for national economies. In the early 21st century, there exist two different approaches to incorporating land and location into a general equilibrium theory. Dating from the classic work of von Thünen (1826), a rich variety of land-location density models have been developed. In a density model, a continuum of agents is distributed over a continuous location space. Given that simple calculus can be used in the analysis, these density models continue to be the “workhorse” of urban economics and location theory. However, the behavioral meaning of each agent occupying an infinitesimal “density of land” has long been in question. Given this situation, a radically new approach, called the σ-field approach, was developed in the mid-1980s for modeling land in a general equilibrium framework. In this approach: (1) the totality of land, L, is specified as a subset of ℝ², (2) all possible land parcels in L are given by the σ-field of Lebesgue measurable subsets of L, and (3) each of a finite number of agents is postulated to choose one such parcel. Starting with Berliant (1985), increasingly sophisticated σ-field models of land have been developed. Given these two different approaches to modeling land within a general equilibrium framework, several attempts have thus far been proposed for bridging the gap between them. But while a systematic study of the relationship between density models and σ-field models remains to be completed, the clarification of this relationship could open a new horizon toward a general equilibrium theory of land.

Geography, Trade, and Power-Law Phenomena  

Pao-Li Chang and Wen-Tai Hsu.

This article reviews interrelated power-law phenomena in geography and trade. Given the empirical evidence on the gravity equation in trade flows across countries and regions, its theoretical underpinnings are reviewed. The gravity equation amounts to saying that trade flows follow a power law in distance (or geographic barriers). It is concluded that in the environment with firm heterogeneity, the power law in firm size is the key condition for the gravity equation to arise. A distribution is said to follow a power law if its tail probability follows a power function in the distribution’s right tail. The second part of this article reviews the literature that provides the microfoundation for the power law in firm size and reviews how this power law (in firm size) may be related to the power laws in other distributions (in incomes, firm productivity and city size).
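
A standard way to eyeball a power law is the log rank-log size regression; the sketch below runs it on simulated Pareto draws with tail exponent 1 (the Zipf benchmark for city sizes). Data and cutoffs are illustrative.

```python
# Sketch: detecting a power law in the right tail via the classic
# log rank - log size regression. Draws from a Pareto distribution
# with tail exponent 1 should produce a slope close to -1.
import numpy as np

rng = np.random.default_rng(5)
zeta = 1.0                                  # tail exponent
sizes = rng.pareto(zeta, size=5000) + 1.0   # P(S > s) = s**(-zeta), s >= 1
sizes = np.sort(sizes)[::-1]                # descending
ranks = np.arange(1, len(sizes) + 1)

# Fit log(rank) = c + slope * log(size) on the upper tail only
tail = slice(0, 500)                        # largest 10% of observations
slope, intercept = np.polyfit(np.log(sizes[tail]), np.log(ranks[tail]), 1)
print(f"estimated tail slope: {slope:.2f}")  # close to -zeta = -1
```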

The Implications of School Assignment Mechanisms for Efficiency and Equity  

Atila Abdulkadiroğlu.

Parental choice over public schools has become a major policy tool to combat inequality in access to schools. Traditional neighborhood-based assignment is being replaced by school choice programs, broadening families’ access to schools beyond their residential location. Demand and supply in school choice programs are cleared via centralized admissions algorithms. Heterogeneous parental preferences and admissions policies create trade-offs between efficiency and equity. The data from centralized admissions algorithms can be used effectively for credible research design toward a better understanding of school effectiveness, which in turn can be used for school portfolio planning and student assignment based on match quality between students and schools.
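
The workhorse centralized algorithm in this setting is student-proposing deferred acceptance. The sketch below is a minimal implementation; the preferences, priorities, and capacities are invented for illustration.

```python
# Sketch: student-proposing deferred acceptance, the mechanism
# behind many centralized school choice programs.
def deferred_acceptance(student_prefs, school_priorities, capacity):
    held = {s: [] for s in school_priorities}   # tentative matches
    nxt = {i: 0 for i in student_prefs}         # next school to try
    free = list(student_prefs)
    while free:
        i = free.pop()
        if nxt[i] >= len(student_prefs[i]):
            continue                            # i exhausted her list
        s = student_prefs[i][nxt[i]]
        nxt[i] += 1
        held[s].append(i)
        # school keeps its highest-priority students up to capacity
        held[s].sort(key=school_priorities[s].index)
        while len(held[s]) > capacity[s]:
            free.append(held[s].pop())          # reject the lowest
    return {s: sorted(m) for s, m in held.items()}

student_prefs = {1: ["A", "B"], 2: ["A", "B"], 3: ["B", "A"]}
school_priorities = {"A": [2, 1, 3], "B": [1, 2, 3]}
capacity = {"A": 1, "B": 1}
print(deferred_acceptance(student_prefs, school_priorities, capacity))
# {'A': [2], 'B': [1]} -- student 3 is unassigned (capacity binds)
```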

Improving on Simple Majority Voting by Alternative Voting Mechanisms  

Jacob K. Goeree, Philippos Louis, and Jingjing Zhang.

Majority voting is the predominant mechanism for collective decision making. It is used in a broad range of applications, ranging from national referenda to small group decision making. It is simple, transparent, and induces voters to vote sincerely. However, it is increasingly recognized that it has some weaknesses. First of all, majority voting may lead to inefficient outcomes. This happens because it does not allow voters to express the intensity of their preferences. As a result, an indifferent majority may win over an intense minority. In addition, majority voting suffers from the “tyranny of the majority,” i.e., the risk of repeatedly excluding minority groups from representation. A final drawback is the “winner-take-all” nature of majority voting, i.e., it offers no compensation for losing voters. Economists have recently proposed various alternative mechanisms that aim to produce more efficient and more equitable outcomes. These can be classified into three different approaches. With storable votes, voters allocate a budget of votes across several issues. Under vote trading, voters can exchange votes for money. Under linear voting or quadratic voting, voters can buy votes at a linear or quadratic cost, respectively. The properties of the different alternative mechanisms can be characterized using theoretical modeling and game-theoretic analysis. Lab experiments are used to test theoretical predictions and evaluate their fitness for actual use in applications. Overall, these alternative mechanisms hold the promise to improve on majority voting but have their own shortcomings. Additional theoretical analysis and empirical testing are needed to produce a mechanism that robustly delivers efficient and equitable outcomes.
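
A toy example can show how quadratic voting lets an intense minority outvote an indifferent majority. The valuations below are invented, and the "votes proportional to value" rule follows from equating the marginal cost 2v of a vote with a benefit proportional to the voter's value.

```python
# Toy example: an intense minority versus an indifferent majority.
# Under quadratic voting the cost of v votes is v**2, so vote totals
# aggregate preference intensities rather than head counts.
values = [-1, -1, -1, +8, +8]   # sign = preferred side, size = intensity

# Simple majority: one vote each, intensity ignored
majority_margin = sum(1 if v > 0 else -1 for v in values)
print("majority margin:", majority_margin)    # -1: "no" wins

# Quadratic voting: votes proportional to value (here v_i = value / 2)
qv_margin = sum(v / 2 for v in values)
print("quadratic voting margin:", qv_margin)  # +6.5: "yes" wins

# Total surplus of "yes" is +16 - 3 = +13 > 0, so majority rule picks
# the inefficient outcome while quadratic voting picks the efficient one.
```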

Incentives and Performance of Healthcare Professionals  

Martin Chalkley.

Economists have long regarded healthcare as a unique and challenging area of economic activity on account of the specialized knowledge of healthcare professionals (HCPs) and the relatively weak market mechanisms that operate. This places a consideration of how motivation and incentives might influence performance at the center of research. As in other domains, economists have tended to focus on financial mechanisms and, when considering HCPs, have therefore examined how existing payment systems and potential alternatives might impact on behavior. There has long been a concern that simple arrangements such as fee-for-service, capitation, and salary payments might induce poor performance, and that has led to extensive investigation, both theoretical and empirical, of the linkage between payment and performance. An extensive and rapidly expanding field in economics, contract theory and mechanism design, has been applied to study these issues. The theory has highlighted both the potential benefits and the risks of incentive schemes to deal with the information asymmetries that abound in healthcare. There has been some expansion of such schemes in practice, but these are often limited in application and the evidence for their effectiveness is mixed. Understanding why there is this relatively large gap between concept and application gives a guide to where future research can most productively be focused.

The Indeterminacy School in Macroeconomics  

Roger E. A. Farmer.

The indeterminacy school in macroeconomics exploits the fact that macroeconomic models often display multiple equilibria to understand real-world phenomena. There are two distinct phases in the evolution of its history. The first phase began as a research agenda at the University of Pennsylvania in the United States and at CEPREMAP in Paris in the early 1980s. This phase used models of dynamic indeterminacy to explain how shocks to beliefs can temporarily influence economic outcomes. The second phase was developed at the University of California Los Angeles in the 2000s. This phase used models of incomplete factor markets to explain how shocks to beliefs can permanently influence economic outcomes. The first phase of the indeterminacy school has been used to explain volatility in financial markets. The second phase of the indeterminacy school has been used to explain periods of high persistent unemployment. The two phases of the indeterminacy school provide a microeconomic foundation for Keynes’ general theory that does not rely on the assumption that prices and wages are sticky.

Leverage Cycle Theory of Economic Crises and Booms  

John Geanakoplos.

Traditionally, booms and busts have been attributed to investors’ excessive or insufficient demand, irrational exuberance and panics, or fraud. The leverage cycle begins with the observation that much of demand is facilitated by borrowing and that crashes often occur simultaneously with the withdrawal of lending. Uncertainty scares lenders before investors. Lenders are worried about default and therefore attach credit terms like collateral or minimum credit ratings to their contracts. The credit surface, depicting interest rates as a function of the credit terms, emerges in leverage cycle equilibrium. The leverage cycle is about booms when credit terms, especially collateral, are chosen to be loose, and busts when they suddenly become tight, in contrast to the traditional fixation on the (riskless) interest rate. Leverage cycle crashes are triggered at the top of the cycle by scary bad news, which has three effects. The bad news reduces every agent’s valuation of the asset. The increased uncertainty steepens the credit surface, causing leverage to plummet on new loans, explaining the withdrawal of credit. The high-valuation leveraged investors holding the asset lose wealth when the price falls; if their debts are due, they lose liquid wealth and face margin calls. Each effect feeds back and exacerbates the others and increases the uncertainty. The credit surface is steeper for long loans than short loans because uncertainty is higher. Investors respond by borrowing short, creating a maturity mismatch and voluntarily exposing themselves to margin calls. When uncertainty rises, the credit surface steepens more for low credit rating agents than for highly rated agents, leading to more inequality. The leverage cycle also applies to banks, leading to a theory of insolvency runs rather than panic runs. The leverage cycle policy implication for banks is that there should be transparency, which will induce depositors or regulators to hold down bank leverage before insolvency is reached. This is contrary to the view that opaqueness is a virtue of banks because it lessens panic.
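
The link between perceived worst-case payoffs and equilibrium leverage can be shown with a small binomial calculation in the spirit of the theory above: if lenders insist on riskless promises, the loan cannot exceed the discounted worst-case value of the collateral. All numbers below are invented.

```python
# Numeric sketch of the leverage mechanism. With no default in
# equilibrium, the most a lender advances against one unit of the
# asset is the discounted worst-case payoff, so leverage moves with
# the perceived worst case.
def max_leverage(price: float, worst_payoff: float, r: float) -> float:
    loan = worst_payoff / (1 + r)   # largest riskless promise
    margin = price - loan           # buyer's own capital ("haircut")
    return price / margin

# Normal times: asset at 1.00, worst case 0.90
print(f"boom leverage:  {max_leverage(1.00, 0.90, 0.05):.1f}x")  # ~7x
# Scary bad news: price falls a little, worst case falls a lot
print(f"crash leverage: {max_leverage(0.95, 0.50, 0.05):.1f}x")  # ~2x
```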

Limited Dependent Variables and Discrete Choice Modelling  

Badi H. Baltagi.

Limited dependent variable models are regression models where the dependent variable takes limited values, like zero and one for binary choice models, or a multinomial model where there are a few choices, like modes of transportation, for example, bus, train, or car. Binary choice examples in economics include a woman’s decision to participate in the labor force, or a worker’s decision to join a union. Other examples include whether a consumer defaults on a loan or a credit card, or whether they purchase a house or a car. This qualitative variable is recoded as one if the female participates in the labor force (or the consumer defaults on a loan) and zero if she does not participate (or the consumer does not default on the loan). Least squares using a binary choice model is inferior to logit or probit regressions. When the dependent variable is a fraction or proportion, inverse logit regressions are appropriate as well as fractional logit quasi-maximum likelihood. An example of the inverse logit regression is the effect of beer tax on reducing motor vehicle fatality rates from drunken driving. The fractional logit quasi-maximum likelihood is illustrated using an equation explaining the proportion of participants in a pension plan using firm data. The probit regression is illustrated with a fertility empirical example, showing that parental preferences for a mixed sibling-sex composition in developed countries have a significant and positive effect on the probability of having an additional child. Multinomial choice models, where the number of choices is more than two, like bond ratings in finance, may have a natural ordering. Another example is the response to an opinion survey, which could vary from strongly agree to strongly disagree. Alternatively, the choices may not have a natural ordering, like the choice of occupation or modes of transportation. The censored regression model is motivated by estimating the expenditures on cars or the amount of mortgage lending. In this case, the observations are censored because we observe the expenditures on a car (or the mortgage amount) only if the car is bought or the mortgage approved. In studying poverty, we exclude the rich from our sample. In this case, the sample is not random. Applying least squares to the truncated sample leads to biased and inconsistent results. This differs from censoring: in the latter case, no data are excluded. In fact, we observe the characteristics of all mortgage applicants, even those that do not actually get their mortgage approved. Selection bias occurs when the sample is not randomly drawn. This is illustrated with a labor participation equation (the selection equation) and an earnings equation, where earnings are observed only if the worker participates in the labor force, and otherwise zero. Extensions to panel data limited dependent variable models are also discussed and empirical examples are given.
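
The sketch below contrasts a logit fit with the linear probability model on synthetic labor-force-participation data; the variables, coefficients, and sample size are invented for illustration.

```python
# Sketch: why a binary outcome calls for logit/probit rather than
# least squares, on synthetic labor-force-participation data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 1000
education = rng.normal(12, 2, n)   # years of schooling
kids = rng.integers(0, 4, n)       # children under 6

# True model: latent index -> participation probability
index = -3.0 + 0.3 * education - 0.8 * kids
prob = 1 / (1 + np.exp(-index))
participates = (rng.uniform(size=n) < prob).astype(float)

X = sm.add_constant(np.column_stack([education, kids]))
logit_fit = sm.Logit(participates, X).fit(disp=0)
print(logit_fit.params)   # estimates near (-3.0, 0.3, -0.8)

# The linear probability model by OLS can predict outside [0, 1]:
ols_fit = sm.OLS(participates, X).fit()
print("OLS fitted range:",
      ols_fit.fittedvalues.min(), ols_fit.fittedvalues.max())
```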

Machine Learning Econometrics: Bayesian Algorithms and Methods  

Dimitris Korobilis and Davide Pettenuzzo.

Bayesian inference in economics is primarily perceived as a methodology for cases where the data are short, that is, not informative enough to obtain reliable econometric estimates of quantities of interest. In these cases, prior beliefs, such as the experience of the decision-maker or results from economic theory, can be explicitly incorporated into the econometric estimation problem and enhance the desired solution. In contrast, in fields such as computing science and signal processing, Bayesian inference and computation have long been used for tackling challenges associated with ultra high-dimensional data. Such fields have developed several novel Bayesian algorithms that have gradually been established in mainstream statistics, and they now have a prominent position in machine learning applications in numerous disciplines. While traditional Bayesian algorithms are powerful enough to allow for estimation of very complex problems (for instance, nonlinear dynamic stochastic general equilibrium models), they are not able to cope computationally with the demands of rapidly increasing economic data sets. Bayesian machine learning algorithms are able to provide rigorous and computationally feasible solutions to various high-dimensional econometric problems, thus supporting modern decision-making in a timely manner.
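
As a reminder of the basic mechanics that these algorithms scale up, the sketch below computes the exact posterior in the simplest conjugate case, a Gaussian linear regression with known noise variance and a Gaussian shrinkage prior; the data are simulated and the prior is an illustrative assumption.

```python
# Sketch: prior information entering estimation in the simplest
# conjugate setting. The posterior mean shrinks OLS toward the
# prior -- the same mechanics that, with smarter algorithms,
# underpin high-dimensional Bayesian methods.
import numpy as np

rng = np.random.default_rng(7)
n, k = 50, 3
X = rng.normal(size=(n, k))
beta_true = np.array([1.0, 0.0, -0.5])
y = X @ beta_true + rng.normal(scale=1.0, size=n)

sigma2 = 1.0   # noise variance (assumed known)
tau2 = 0.5     # prior variance: beta ~ N(0, tau2 * I)

# Posterior: N(m, V) with
#   V = (X'X / sigma2 + I / tau2)^(-1),   m = V X'y / sigma2
V = np.linalg.inv(X.T @ X / sigma2 + np.eye(k) / tau2)
m = V @ X.T @ y / sigma2
print("posterior mean:", m.round(2))
print("posterior sd:  ", np.sqrt(np.diag(V)).round(2))
```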

Preferential Trade Agreements: Recent Theoretical and Empirical Developments  

James Lake and Pravin Krishna.

In recent decades, there has been a dramatic proliferation of preferential trade agreements (PTAs) between countries that, while legal, contradict the non-discrimination principle of the world trade system. This raises various issues, both theoretical and empirical, regarding the evolution of trade policy within the world trade system and the welfare implications for PTA members and non-members. The survey starts with the Kemp-Wan-Ohyama and Panagariya-Krishna analyses in the literature, which show theoretically that PTAs can always be constructed so that they (weakly) increase the welfare of members and non-members. Considerable attention is then devoted to recent developments on the interaction between PTAs and multilateral trade liberalization, focusing on two key incentives: an “exclusion incentive” of PTA members and a “free riding incentive” of PTA non-members. While the baseline presumption one should have in mind is that these incentives lead PTAs to inhibit the ultimate degree of global trade liberalization, this presumption can be overturned when dynamic considerations are taken into account or when countries can negotiate the degree of multilateral liberalization rather than facing a binary choice over global free trade. Promising areas for pushing this theoretical literature forward include the growing use of quantitative trade models, incorporating rules of origin and global value chains, modeling the issues surrounding “mega-regional” agreements, and modeling the possibility of exit from PTAs. Empirical evidence in the literature is mixed regarding whether PTAs lead to trade diversion or trade creation, whether PTAs have significant adverse effects on non-member terms-of-trade, whether PTAs lead members to lower external tariffs on non-members, and the role of PTAs in facilitating deep integration among members.


New Economic Theories

Carlo Carraro, University of Venice and FEEM, Venice, Italy

Environmental and Resource Economics 11, 365–381 (April 1998). https://doi.org/10.1023/A:1008204826571

This paper analyses some of the most important spillovers of recent developments in economic theory into environmental economics. Attention is given to the analysis of sustainable economic development paths, where endogenous growth models are used; the implications of environmental dumping and, more generally, of policies concerning global environmental issues, where new trade theories are very useful; and the effectiveness of environmental policy instruments when markets are imperfectly competitive, where industrial organisation theory is employed. The paper not only notes recent developments in environmental economics, but also relates these to the previous environmental economics literature. Thus, it can be assessed whether new results actually improve our knowledge of crucial economic and environmental issues.


Keywords: economic theory, endogenous growth, environmental innovation, imperfect competition, international trade

Transforming Post-Communist Political Economies (1998)

Chapter 2: Rethinking the Theory of Economic Policy: Some Implications of the New Institutionalism

Thráinn Eggertsson

INTRODUCTION

The early postwar domination of welfare economics (Samuelson, 1947: Ch. 8; Bergson, 1938), the Keynesian revolution, and the new field of development economics (Kindleberger, 1958) ushered in an age of excessive expectations for the potency of economic policy. To organize their thoughts about the contribution of economics to policy, and confident of their capacity to control social systems, many economists relied on a popular framework, the theory of economic policy (Tinbergen, 1956). In the 1970s, this excessive optimism changed as policy failures and a clearer recognition of the role of private incentives buried naive hopes of fine-tuning the economic system or individual markets (Lucas, 1976; Posner, 1986:Part III). Events also diminished early hopes that development economics would provide strategies for rapid transition in the Third World (Hirschman, 1981). In economics, a new emphasis on information scarcity suggested that transaction costs seriously limit effective social engineering and complicate economic organization (Furubotn and Richter, 1993; Kreps, 1990; Milgrom and Roberts, 1992; North, 1990; Stiglitz, 1994; Williamson, 1985).

Growing pessimism about traditional approaches produced neither systematic reevaluation of development strategies nor a new consensus on the appropriate scope for public policy. Economists offer conflicting explanations of economic successes and failures among Third World countries, while the Eastern European revolution of 1989 took social science unawares when it required guidance for rapid transition to markets. Mainstream economics has lacked a general theory of economic systems and structural change. In recent years, however, a new theory of institutions based on the economics of property rights and transaction costs has earned a measure of recognition among economists.

Although institutional analysis could potentially complement standard macro- and microeconomic theory in the design of policies for economic development, it has yet to develop a strong policy orientation. 1 This chapter introduces institutional analysis to the old theory of economic policy—to its policy models and its instruments, targets, and policy measures—in the hope that the new institutionalism will reveal its policy implications when viewed against the background of the traditional policy world. More particularly, the chapter explores the ways information scarcity affects policies aimed at social transformation.

The next section briefly summarizes the old theory of economic policy, associated with the Dutch economist Jan Tinbergen (1956). This is followed by an examination of three policy issues that were not a central concern of mainstream economics in the early post-war period: (1) the requirements of structural policy, (2) the need to extend the policy model, and (3) the implications of information scarcity. The chapter then presents a public policy view of the new institutionalism; problems of incomplete data and control and of incomplete models and decisions suggest an intricate policy world. Next is a discussion of the policy determinacy implicit in rational-choice political economy. The final section looks at the general policy implications of institutional analysis for major social transformations, such as those attempted in Eastern Europe and in the Third World.

THE OLD THEORY OF ECONOMIC POLICY

In a perceptive discussion of the theory of economic policy, Hansen (1963) emphasizes the central role of models in policy formulation. As almost all policy aims at influencing economic outcomes or processes, policymakers—politicians, administrators, social scientists, voters, or rulers—must rely on a model, or a description of the economic system, which sometimes is little more than a rough qualitative picture (Hansen, 1963:3). It is argued below that the information assumptions of institutional analysis imply that actors employ incomplete and variable models of their environments when attempting to advance their public or private policies.

A formal model of an economic system, such as a firm, a market, or an economy, can be written in the following general way:

f_i(x_1, …, x_n; a_1, …, a_m) = 0,   i = 1, …, n.   (1)

In equation (1), x_1, …, x_n are n endogenous variables, and a_1, …, a_m are m exogenous variables, lagged variables, or parameters, some of which (for instance, exchange rates, tax rates, base money, price ceilings, import restrictions, plan indicators, or agricultural production quotas) are controlled by the policy actor (Hansen, 1963:5). Note a subtle distinction here in the meaning of exogeneity. All the exogenous variables in equation (1) are exogenous to the actors of the social system that the model attempts to describe. The policymakers are distinct from the social system and control some of the exogenous variables, whereas other exogenous variables constrain their actions.

The policy model describes the choices open to policymakers: their opportunities to reach targets (desired values of endogenous variables) by applying instruments (exogenous variables they control). Policy targets (goals) are derived from the preferences of policymakers. The structure of the policy model prescribes what target values are attainable and how they are attained. Policy targets may be absolute, or the policymaker may weigh target variables together in a target preference function, T(x_1, …, x_n). 2 Economic policy uses policy instruments either to attain absolute targets or to maximize the target preference function. When targets are fixed (or when target preference functions are maximized without limitations), basic logic suggests two well-known rules of thumb: (1) in general, "the number of instruments should be (at least) equal to the number of targets," and (2) individual instruments should not be assigned to specific targets, but all instruments should be coordinated and directed toward the set of targets (Hansen, 1963:7).
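
In the linear case the counting rule can be made concrete: if the reduced form maps instruments into targets through a known matrix, hitting a vector of fixed targets exactly requires at least as many (linearly independent) instruments as targets. The sketch below, with invented coefficients not taken from Hansen, solves for the required instrument settings and shows the least-squares compromise when instruments are too few.

```python
# Sketch: the Tinbergen counting rule in a linear policy model,
# targets = A @ instruments + b, with A known.
import numpy as np

# Two targets (growth, inflation), two instruments (spending, rate)
A = np.array([[0.8, -0.5],
              [0.3,  0.6]])
b = np.array([1.0, 2.0])        # effect of uncontrolled exogenous variables
x_star = np.array([3.0, 2.0])   # absolute targets

# With as many independent instruments as targets, solve exactly:
instruments = np.linalg.solve(A, x_star - b)
print("required instrument settings:", instruments.round(2))

# With two targets but one instrument the system is overdetermined;
# the best one can do is least squares, i.e., maximize a quadratic
# target preference function rather than hit both targets exactly.
A1 = A[:, :1]
best, *_ = np.linalg.lstsq(A1, x_star - b, rcond=None)
print("single-instrument compromise:", best.round(2))
```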

Finally, the structure of the policy model has important implications for policy. It describes the interrelationships among the variables in the model (equation [1]) and determines whether the model can be divided into autonomous departments. Following Simon (1953), all endogenous variables and instruments in a policy model can be arranged according to causal ordering from the first order to the highest, nth, order. Instruments of the nth order influence targets of the nth order without affecting lower orders of the system. However, the use of first-order instruments has repercussions not only for first-order target variables, but also for endogenous variables at higher levels, possibly throughout the system (Hansen, 1963:18-22).

NEW PERSPECTIVES AND THE OLD THEORY

The Tinbergen (1956) framework continues to be an essential part of our mental apparatus. When prescribing policy, economists think, explicitly or implicitly, in terms of instruments and target preference functions, and the notion of a model intervening between preferences and policy remains relevant. A new outlook in social science, however, has weakened economists' belief in their ability to prescribe economic outcomes and mold economic systems. We turn now to three related issues: (1) the requirements of structural policy, (2) the need to extend the policy model, and (3) the implications of information scarcity.

The Requirements of Structural Policy

The old theory of economic policy distinguished quantitative policy from qualitative or structural policy. Quantitative policy takes as given the basic structure of the economic system (or subsystem), i.e., equation (1), and seeks to manipulate existing economic relationships toward some particular end. Until recently, the findings of mainstream economic theory were relevant primarily for quantitative policy, because the theory made few attempts to endogenize or explain (parts of) the economic system. Structural policy, on the other hand, seeks to change the structure of equation (1), and sometimes to add new variables or new relationships. The (immediate) goal is not to achieve a new value for a target variable in the quantitative policy model, but to create a new relationship between (new) instruments and targets.

The discussion in this chapter emphasizes the distinction between quantitative and structural policy, although, as we shall see, the new emphasis on information and incentives has blurred this distinction. Economists have recognized that over time, what were assumed to be quantitative policy initiatives (e.g., rent control, increases in tax rates, or new welfare benefits) have often altered the structure of equation (1). However, it is useful conceptually to distinguish fundamental system transformations from behavioral responses to changes in relative prices within a given system.

Structural policy obviously invites new quantitative policy (and a new quantitative policy model) because the new system must be managed. Furthermore, if the transition to the target structure is slow, appropriate quantitative policy is required to ensure the orderly operation of the system during the transition period (McKinnon, 1991).

Unlike quantitative policy, structural policy cannot be employed effectively without a theory of institutions and institutional change. Policymakers can conserve their mental energy and use relatively simple models, however, as long as low-order instruments can generate spontaneous adjustments in higher-order variables—i.e., in critical institutions—throughout the system. For instance, policymakers would not require complex policy models to guide the transition to markets in Russia and Eastern Europe if they believed that appropriate market institutions and organizations would emerge autonomously once "prices are set free" (Murrell, 1995). In the final analysis, the structure of the social system is an empirical question, but as a rule of thumb, policymakers in a world of scarce information usually do well to search for powerful low-order instruments.

The Need to Extend the Policy Model

The old theory, which was concerned primarily with quantitative micro- and macroeconomic relationships, assumed that the target preference functions of policymakers coincide with the normative standards of economic theory. Traditional policy analysis usually ignored the incentives and behavior of political actors or the influence of political processes on targets for growth, stability, pollution abatement, regulation, or the division of investment funds among sectors. Macroeconomics was concerned with stability and growth, while microeconomics focused on allocative efficiency, assuming that policymakers shared these goals.

In recent decades, various scholars have extended the policy model to incorporate endogenous politicians, and analyses of the latter's policies now appear in the literature. Fields such as public choice, political economy, and political macroeconomics attempt to endogenize the choice of targets and instruments, and to provide the elements for a positive theory of structural change (Mueller, 1989; Alt and Shepsle, 1990; Hettich and Winer, 1993; Alesina, 1991).

Pure quantitative economic policy typically (though not always) leaves the political equilibrium intact, particularly when the policy achieves the intended results. In political equilibrium, those in power tend to agree on traditional normative economic goals, such as stability, growth, and allocative efficiency, within the existing institutional framework. Of course, the prevailing institutional framework may leave little or no scope for economic progress. In a relatively stable world, the role of those who control and coordinate key policy instruments is usually well known and clearly established. Generally, there is little doubt about the policy sphere of actors such as the central bank, the finance ministry, the environmental protection agency, or the central planning bureau. Policy analysts have relatively little need for elaborate positive political theory to identify the set of politically sustainable policies.

Structural policy, on the other hand, is frequently associated with political instability. Substantial structural measures usually alter the distribution of wealth and power and often emerge in times of political upheaval. The choice of new economic structures frequently involves political disputes and struggles that render the control and coordination of policy instruments uncertain, especially over time. To formulate viable economic policy in an unstable environment and minimize the likelihood of policy reversals, the analyst needs to model interactions among economic, political, and social forces. The need to expand the policy model to incorporate this interaction is particularly obvious when policy experts seek strategies for instituting economic measures that (at least in the short run) have tenuous support among the general public, those in power, and those seeking power. Some policy analysts, for instance, recommended shock treatments or big-bang measures in part because strong measures are likely to overwhelm a disillusioned public and unreliable politicians. They are also more likely to create irreversible structural change (Åslund, 1995).

Implications of Information Scarcity

The last decades of the twentieth century have seen increasingly explicit concern with the role of information in social systems and in social science (Coase, 1960; Diamond and Rothschild, 1989; Hirshleifer and Riley, 1979, 1992; Stiglitz, 1994). The very concept of a social system operating with full information staggers the imagination, yet the impression that early postwar neoclassical economics assumed full information is widespread.

A theory of social systems that explicitly models the information environment of its participants confronts three types of information problems: (1) data are scarce, (2) actors economize on scarce information by formulating simplifying models of their environments (as do scientists), and (3) actors have limited capacity to absorb and process data (learn and make decisions). These three issues can be characterized as incomplete data, incomplete models, and incomplete decisions, respectively. The information revolution that has taken place in the social sciences during the last few decades has focused on problems of incomplete data, although the notion of incomplete models and decisions has received some attention. Yet it can be argued that a new theory of economic policy must recognize all three information problems. It must also determine their impact on public policy and the interaction between private and public policy.

The old theory of macroeconomic policy, or rather several scholars in that field, did acknowledge that incomplete data and models can undermine the efforts of policymakers (Hansen, 1963:31-36; Friedman, 1961). In particular, it was argued that various lags of uncertain length can pervert the timing of corrective measures and even turn them into destabilizing impulses. In the 1970s, when macroeconomics acquired formal microfoundations, the theory even attempted to incorporate the interplay between public and private policy models. The early rational expectations school assumed that economic actors would be able to absorb the policy models used by the authorities, thereby enabling the actors to second-guess the authorities' intentions and undertake actions that would undermine economic policy. Random policy measures, however, would not produce this effect (Lucas, 1976).
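A toy simulation makes the second-guessing point concrete. The sketch below is not Lucas's own model, just a minimal rational-expectations setup with made-up numbers: agents know the money-growth rule, so only surprises move output, and a reduced form estimated under one rule fails as soon as the authorities try to exploit it.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(mu, T=10_000):
    # Policy rule: money growth m_t = mu + e_t, with e_t ~ N(0, 1).
    e = rng.normal(0.0, 1.0, T)
    m = mu + e
    # Agents know the rule, so expected money growth is mu; only the
    # surprise component moves output (a Lucas-type supply curve).
    y = m - mu
    return m, y

# Regime 1: an econometrician estimates the reduced form y = a + b*m.
m1, y1 = simulate(mu=2.0)
b, a = np.polyfit(m1, y1, 1)
print(f"estimated reduced form: y = {a:.2f} + {b:.2f} m")

# The authorities exploit it: raising average money growth to 5.0
# 'should' raise average output to a + b*5. It does not.
m2, y2 = simulate(mu=5.0)
print(f"predicted mean output: {a + b * 5.0:.2f}")
print(f"actual mean output:    {y2.mean():.2f}")   # ~0: the relation broke
```

Only the random, unanticipated component moves output here, which is exactly the point the text attributes to the early rational expectations school.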

Similarly, the interactions between public and private policy models in individual markets are implicit in the work of Steven N. S. Cheung, who pioneered the economics of contracts (Cheung, 1969). For instance, in his studies of rent control in Hong Kong, Cheung recognized that public policy models were incomplete in that regulators lacked knowledge of how economic actors would establish a new equilibrium in response to official price ceilings in rental markets (Cheung, 1975, 1976). As rent control constrained the price mechanism, the new equilibrium (and private policy) involved various nonprice margins, including the transformation of residential buildings into unregulated warehouses and (premature) demolition and rebuilding. Cheung's empirical work demonstrates, however, that skillful regulators are often able to use trial and error to acquire knowledge about private models, which they may then use to revise the public policy model, design more effective policy measures, avoid unwanted side effects, and eventually come tolerably close to their policy targets.

INSTITUTIONS, INFORMATION, AND CONTROL

We now turn to a discussion of the general implications of the new economics of institutions for the theory of economic policy.

In its initial phase, as is common for new fields of scholarship, the economics of institutions has emphasized explanation, empirical work, and policy analysis—in that order. Most studies, whether examining institutional change or the economic consequences of alternative institutions, are concerned with the link between institutions and wealth or the social dividend. Therefore, wealth is frequently the (implicit) policy target in these studies. The distribution of power and wealth usually enters into these works as a determinant of economic outcomes or as an important force propelling institutional change.

Incomplete Data and Control

Information and incentives are the driving forces behind theories of social systems that rely on methodological individualism. Institutions are of critical importance for economic performance because they affect both incentives and the cost of information. The economics of institutions derives the structure of the policy model (our equation [1]) from the system's underlying institutions or, in other words, from the rules that, in the language of game theory, affect the expected payoffs of the actors. Therefore, a change in the formal or informal rules that leaves all payoff equations unaffected does not count as institutional change. Institutions emerge from the fusion of social customs and habits; formal rules and regulations; and various enforcement mechanisms, including internalized social norms. The primary weakness of the economics of institutions is its limited understanding of this amalgam of formal and informal rules and their attendant enforcement mechanisms. Most studies ignore social values, while others treat them either as constants or as exogenous variables.

In the economics of institutions, the notion of information scarcity usually enters into the analysis through the assumption of incomplete data, but it is the union of incomplete data and what may be called the control problem that gives the new institutionalism its distinctive flavor. Simply stated, costly measurement is responsible for incomplete data. Incomplete data raise the cost of verifying quality and monitoring behavior. This draws attention to one of the central complexities of economic life: commodities and behavior usually have multiple valuable dimensions or margins. Rising marginal cost in acquiring data suggests that actors are usually unable to control fully all margins of the resources over which they have nominal control. Therefore, incomplete control is a general condition, and, as economics first recognized in the case of open-access fisheries, lack of control generates incentives that can lead actors to dissipate wealth (Barzel, 1989).
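The fisheries example can be written out in a few lines. The following Gordon-type sketch uses illustrative parameters (none come from Barzel): under open access, entry continues until average rent is zero, dissipating the wealth a sole owner with full control would capture.

```python
# Toy open-access model: sustainable yield H(E) = a*E - b*E**2 for effort E,
# rent = p*H(E) - c*E. All parameter values are hypothetical.
p, c = 2.0, 1.0       # price per unit of harvest, cost per unit of effort
a, b = 1.0, 0.01      # yield-curve coefficients

def rent(E):
    return p * (a * E - b * E**2) - c * E

E_owner = (p * a - c) / (2 * p * b)   # sole owner: marginal revenue = marginal cost
E_open = (p * a - c) / (p * b)        # open access: entry until average rent is zero

print(f"sole-owner effort {E_owner:.0f}, rent {rent(E_owner):.1f}")   # 25, 12.5
print(f"open-access effort {E_open:.0f}, rent {rent(E_open):.1f}")    # 50, 0.0
```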

The control actors exercise over resources can derive from both external and internal sources: institutions, which represent socially assigned control, are external sources; the various measures actors take themselves, such as monitoring, fencing and locking up valuables, and contracting with other actors, are internal sources. In the literature, the costs of establishing and maintaining control over resources both in exchange and in use are commonly known as transaction costs. High transaction costs act as barriers to productive activities. The policy lesson is clear: structural policy that seeks to increase the capacity of an economic system in order to generate wealth must design institutions that lower transaction costs (North, 1990).

An increase in the social dividend has the potential to benefit all members of a social system, but imperfect institutions (imperfect in terms of the wealth criterion) often persist. To explain imperfect institutions, the new institutionalism typically looks to the political domain and uses high transaction costs in the political process to explain why actors are unable to agree on institutions that would be more conducive to economic growth (Bates, 1990; Moe, 1990; Weingast, 1995). The literature also recognizes that many social institutions and structures that facilitate economic growth emerge spontaneously and not through design. The role of shared social values in economic growth is of particular interest (North, 1990). Scholars in the rational choice tradition have had little success in explaining the emergence and evolution of social values, and it is not clear how policymakers could target social values. Consequently, the role of norms and customs in structural policy is ambiguous. A poor society that attempts to create incentives and information environments for economic growth by launching institutional change can hope for rapid success (1) if its underlying social values are consistent with the new institutions of growth, (2) if social values are malleable and adjust quickly to other aspects of the institutional environment, and (3) if the importance of social values in lowering transaction costs has been overrated. We return to these issues in the final section of the chapter.

Incomplete Models and Incomplete Decisions

In a world of scarce information, those who seek to accomplish structural change must recognize that they are dealing with incomplete, competing models. Although the theory of economic policy has always been stated in terms of policy models, institutional analysis and the information perspective suggest that additional elements are needed:

When attempting to advance their private goals, the subjects of public policy (economic actors, households) rely on private policy models of the physical world, the social system, and the moral order.

Successful structural policy must allow for interactions between public policy models and private models, and revisions to both in response to new data.

An important aspect of public policy is to provide the subjects of policy—actors whose behavior the policymaker seeks to change—with the information needed to revise their private models. This will assist in coordinating models at different levels.

When we recognize that revision of models (learning) is often critical for the success of public policy, the revision process itself becomes of great practical interest. Rational-choice social science relies on rules drawn from logic, mathematics, and probability theory, and assumes that social actors use the universal logical rules of science for updating their beliefs or models. Even when this approach treats the origins of private models as exogenous, the assumption that actors use the general rules of science to update their models (for instance, Bayes' rule) implies that the models originate as purely logical or statistical interpretations of available data. In general, the logical approach cannot explain creative and selective interpretation of available data.

For many purposes, however, scholars can use standard logic to explain how actors revise their models and behavior. For instance, in a recent study, Bates and Weingast (1995) investigate revolutionary transformations in Zambia (movement to democracy) and in the former Yugoslavia (eruption of violent communal conflict) in terms of the updating of shared private beliefs (models). Bates and Weingast model interactions among the players as signaling games, where Bayes' rule is used to update models when new data (signals) become available. The paper demonstrates how a policymaker (Milosevic) can bring about a major change in social systems by manipulating signals.
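The updating mechanism itself is compact. The sketch below is a generic Bayes'-rule illustration in the spirit of the signaling story, with entirely hypothetical probabilities rather than the Bates-Weingast specification.

```python
def update(prior, p_signal_if_benign, p_signal_if_hostile):
    """Posterior probability that the regime is benign after one signal."""
    num = prior * p_signal_if_benign
    den = num + (1 - prior) * p_signal_if_hostile
    return num / den

belief = 0.9                      # shared prior: the regime is benign
for t in range(5):                # five hostile-looking signals in a row
    belief = update(belief, p_signal_if_benign=0.2, p_signal_if_hostile=0.8)
    print(f"after signal {t + 1}: P(benign) = {belief:.3f}")
```

A short run of manipulated signals is enough to move a strong shared prior a long way, which is how a policymaker can engineer a major change in shared beliefs.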

Some scholars question whether actors use standard mathematical logic to update their models. Cognitive psychology and evolutionary biology argue that the human mind relies on "a large and heterogeneous network of functionally specialized computational devices," rather than functioning as a general-purpose computer (Cosmides and Tooby, 1994:329; Tooby and Cosmides, 1992). A union of evolutionary psychology and economics "might be able to create a science of preferences" (Cosmides and Tooby, 1994:331) and improve our understanding of how actors model their environment, especially the moral order.

In sum, a new theory of structural policy must recognize variable and incomplete models at different levels and allow for interactions between public policy models and private models. In its present state, social science is equipped to do this only on the basis of the general-purpose rational methods of science.

A DIGRESSION ON POLICY DETERMINACY

Rational-choice social science, which assumes that all actors optimize under constraints, implicitly suggests a high degree of policy determinacy. As long as neither social nor political actors were seen as rigorous optimizers, analysts believed there still was considerable scope for reforms. However, when the policy model was expanded to include political and social activity, and optimization under constraints was assumed throughout the social system, the policy choice set seemed to shrink and approach an empty set. This meant that structural policy appeared to have zero degrees of freedom.

The notion of incomplete data, incomplete models, and incomplete decisions changes this picture and expands the policy choice set. The changing fortunes of the Nordic welfare state illustrate this point. Lindbeck (1994, 1995) discusses the ways in which welfare state policies created not only a virtuous circle of benefits, but also an unexpected, undesired, and vicious circle of problems. The problems were associated with delayed changes in the behavior of households, interest groups, public-sector administrators, and politicians. These changes in behavior affected work effort, labor force participation, savings, asset choice, entrepreneurship, and short-term macroeconomic stability, and thereby shrank the tax base of the welfare state.

In analyzing these changes in behavior, Lindbeck recognizes the importance of incomplete data (for instance, delays in obtaining information about new welfare programs), but he puts the greatest weight on what might be called incomplete and variable private policy models. As the welfare system unfolded, the various types of actors, from households to politicians, revised their policy models. Lindbeck (1995) argues that the actors did more than revise their positive models of the social system and adjust their strategies for a new environment; they also revised their models of the moral order and updated their shared social values.

Lindbeck's analysis suggests, therefore, that we need to examine social policies in the Nordic welfare state in terms of incomplete public and private models. At the highest level, public policy models (presumably) did not allow for delayed regime changes in various structural relationships within the system as a whole (for instance, in labor supply or in savings ratios). This policy failure at the top is related to a misreading of private policy models, in particular a failure to recognize that actors will revise their models, targets, and policies. A revision of private policy models may change not only individual behavior, but also the structure and performance of organizations (households, social networks, firms, public agencies) that are the engine of social action (North, 1990). In Nordic social networks, the interactive revision of private models apparently first lowered the cost (stigma) of being a bona fide welfare recipient, and then the cost of being a welfare chiseler (Lindbeck, 1995).
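A toy difference equation conveys the flavor of this interactive revision; the functional forms and parameters below are hypothetical, not Lindbeck's, and are meant only to show how delayed norm erosion can feed on itself.

```python
# Hypothetical dynamics: claiming erodes stigma, lower stigma raises the
# long-run claiming share, and behavior adjusts sluggishly toward it.
benefit = 0.5        # generosity of the program
stigma = 1.0         # initial social cost of claiming
share = 0.05         # initial share of claimants

for period in range(10):
    tax_base = 1.0 - share
    print(f"period {period}: claimants {share:.2f}, tax base {tax_base:.2f}, "
          f"stigma {stigma:.2f}")
    target = benefit / (benefit + stigma)   # long-run claiming share given stigma
    share += 0.3 * (target - share)         # sluggish behavioral adjustment
    stigma = max(0.1, 1.0 - share)          # norms erode as claiming spreads
```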

The notion of policy determinacy, which introduced this section, is an in-house problem in the social sciences and relates to broader ambiguities in the concept of efficiency in the economics of institutions (Furubotn, 1994). A world of incomplete information cannot be determinate: with variable and changing policy models, there is ample scope for new policy directions.

PUBLIC POLICY AND SOCIAL TRANSFORMATIONS

The new institutionalism has paid little attention to the role of policymakers and to the specification of policy instruments for institutional change. With few exceptions, the theory provides only implicit policy lessons. The chapter concludes with a few thoughts about these implicit lessons for major structural transformations.

A General Theory?

Institutional analysis emphasizes that the creation of wealth depends in complex ways on institutions, and argues that institutions are rooted in political and social domains. Social science, however, is fragmented into insular disciplines and offers only partial, and often contentious, insights, rather than a reliable, comprehensive view of social systems.

Furthermore, major advances in social science need not provide policymakers with the means to orchestrate major structural changes. Successful transition requires a strategy that will overcome opposition, particularly when the short-term costs of structural adjustment are high, which they frequently will be. In a world of uncertainty, this is a formidable task (Dewatripont and Roland, 1995). As social science evolves and provides better strategies for institutional change, it is also likely to supply the opponents of change with more sophisticated counter-policies. More knowledge can be a two-edged sword, unless conflict over structural policy involves primarily dispute over the effectiveness of different means to a common end.

Complexity, Learning, and Feedback

The current strength of the new institutionalism (and related fields, such as the new theory of the firm, industrial organization, and positive political economy) lies in partial or sectoral analysis where theory offers various policy insights. The policy implications include (1) methods for containing the control problem and thereby reducing misallocation and the waste of resources, and (2) measures to facilitate the revision of incomplete private and public policy models, thus allowing actors to reach their goals more effectively. The old theory of economic policy explicitly demanded the coordination of a set of policy instruments toward clear-cut goals, but many of the policy insights of institutional analysis are far less specific, particularly concerning the measures needed to restructure actors' information environments.

The literature on property rights, agency theory, asymmetric information, organization, and related topics contains a growing body of theories that examine how to structure control and align incentives with policy goals (Milgrom and Roberts, 1992; McMillan, 1995; Williamson, 1985).

At the macro level, the state contributes to low transaction costs and effective control structures in several ways:

By providing stable standards of measurement in exchange, including stable prices, and generally by creating a solid macroeconomic environment.

By credibly committing to honor ownership rights and avoid using state power to seize resources capriciously, and by following a stable and predictable policy of taxation (Weingast, 1993).

By protecting economic actors from each other through various means, including legal processes, and by facilitating (central) organizations that help establish reputation and detect fraud (Greif et al., 1994).

The extent to which private rules and private enforcement are able to substitute for an effective legal system and provide the necessary foundation for long-term economic growth is an unresolved issue. Recent empirical evidence shows that private actors often invent mechanisms for strengthening control and lowering transaction costs when they encounter permissive regimes (China) or bureaucratic and inefficient states (Latin America) (McMillan, 1995; Stone et al., 1996). Although these private arrangements often appear to be quite effective, two types of uncertainty surround them: first, private arrangements may create forces that eventually challenge the political status quo, and thus it is uncertain how long permissive or bureaucratic regimes will tolerate unofficial control systems; and second, it is uncertain whether in the long run, private control systems are capable of supporting a modern, integrated national economy.

A final point about the design of control systems concerns the choice between centralization and decentralization. The direct links among measurement costs, incomplete data, and control, combined with the propensity of measurement costs to increase with distance, slant structural policy toward decentralization in structuring both economic organizations and public administration. As a result, concern with the limits of central control is a recurrent theme in the new institutional analysis (Ostrom et al., 1993).

The idea of incomplete and competing models has various implications for policy, although the literature is particularly weak in this area. This outlook weighs against attempts at great experiments or the rapid implementation of structural changes, and suggests modesty, incrementalism, and learning by doing. A new category of instruments emerges in a world of incomplete models: measures for changing the information environment and for creating incentives for actors to revise their models and make them more compatible with policy targets. In a closed society, for instance, policymakers can alter the information environment dramatically by opening the system and facilitating international contacts, such as trade, telecommunications, direct investments, and educational exchanges. Although such measures may have profound implications for structural change, the actual outcome in a dynamic environment is inherently uncertain and generally cannot be modeled in specific terms as a relation between instruments and targets in a Tinbergen policy world.

Various feedback mechanisms are crucial both for coping with incomplete and competing models and for directing outcomes in social processes. Makers of public policy can advance their aims if they are able to design, or facilitate, feedback mechanisms that inform or punish actors who operate with policy models, data, or even goals that are inconsistent with public policy. A properly structured competitive market provides effective feedback in terms of the aggregate wealth criterion. Public policy can push economic enterprises in the direction of more efficiency by fostering various forms of competition and by providing suitable institutions for structuring exit and entry. Empirical evidence from various parts of the world, including China, indicates that not only private firms, but also various hybrid forms of organization will operate relatively efficiently in a competitive environment (McMillan and Naughton, 1996).

The feedback from competition also constrains political units. If the members of agricultural cooperatives operated by local governments can easily exit and join more desirable cooperatives in other localities, poor management is likely to bring corrective feedback and compel local authorities to revise their policy models or targets. Similarly, free entry and exit discipline higher political units, such as the states of a federation, as Weingast (1995) has shown in his work on market-preserving federalism.

Policy Lags and Pathological Path Dependence

With incomplete models, there will be lengthy lags between the initiation of a policy and the point at which relevant actors get the new structures right. During transitions, public and private actors need to experiment for some time once the fundamental incentives are in place before they are able to master the organizations of a modern market economy, including financial organizations, manufacturing firms, apolitical legal systems, and public administration. Few scholars doubted that structural change would involve substantial lags in learning and adjustment, but the institutional literature increasingly refers to far more dramatic lags, which are attributed to increasing returns and path dependence (Arthur, 1994; David, 1994; North, 1990). Several scholars have argued (1) that communities that share specific private policy models (and related informal institutions) resist public policy measures aimed at lowering transaction costs and increasing efficiency, and (2) that these models are extremely durable, enduring sometimes for centuries or even millennia.

In his study of modern reforms in regional administration in Italy, Putnam (1993) explains regional variations in the success of these reforms by variations in social capital and finds the roots of perverse policy models in the twelfth century. For Russia, Hedlund and Sundström (1996:32) trace perverse policy models to the Middle Ages and argue "that the future for Moscow represents a choice between a hierarchy dominated by strong, authoritarian 'organs', or a total breakdown of all organized societal functions."

This strong version of path dependence (which is still controversial) can be compared to the discovery of debilitating genes in specific human groups, and the implications for structural policy are devastating. The new institutionalism does not appear to propose any instruments or measures for manipulating models at this level, which indicates a new type of policy determinacy and calls for more research.

REFERENCES

Alesina, A. 1991 Macroeconomics and politics. In Macroeconomics Annual, Stanley Fischer, ed. Cambridge, MA: National Bureau of Economic Research.

Alt, J.A., and K.A. Shepsle, eds. 1990 Perspectives on Positive Political Economy. Cambridge, England: Cambridge University Press.

Arthur, B.W. 1994 Increasing Returns and Path Dependence in the Economy. Ann Arbor: University of Michigan Press.

Åslund, A. 1995 How Russia Became a Market Economy. Washington, DC: Brookings Institution.

Barzel, Y. 1989 Economic Analysis of Property Rights. Cambridge, England: Cambridge University Press.

Bates, R.H. 1990 Macropolitical economy in the field of development. Pp. 31-54 in Perspectives on Positive Political Economy, J.A. Alt and K.A. Shepsle, eds. Cambridge, England: Cambridge University Press.

Bates, R.H., and B.R. Weingast 1995 Rationality and Interpretation: The Politics of Transition. Paper prepared for the annual meeting of the American Political Science Association, Chicago, August 31-September 3.

Benham, A., L. Benham, and M. Merithew 1995 Institutional Reforms in Central and Eastern Europe: Altering Paths with Incentives and Information. San Francisco: International Center for Economic Growth.

Bergson, A. 1938 A reformulation of certain aspects of welfare economics. Quarterly Journal of Economics 52(2):310-334.

Cheung, S.N.S. 1969 Transaction costs, risk aversion, and the choice of contractual arrangements. Journal of Law and Economics 12(1):23-42.

1975 Roofs or stars: The stated intents and actual effects of rent ordinance. Economic Inquiry 13:1-21.

1976 Rent control and housing reconstruction: The postwar experience of prewar premises in Hong Kong. Journal of Law and Economics 17(1):27-53.

Coase, R.H. 1960 The problem of social cost. Journal of Law and Economics 3(1):1-44.

Cosmides, L., and J. Tooby 1994 Better than rational: Evolutionary psychology and the invisible hand. American Economic Review 84(2):327-332.

David, P.A. 1994 Why are institutions the "carriers of history"? Path dependence and the evolution of conventions, organizations and institutions. Structural Change and Economic Dynamics 5(2):205-220.

Denzau, A.T., and D.C. North 1994 Shared mental models: Ideologies and institutions. Kyklos 47:3-31.

Dewatripont, M., and G. Roland 1995 The design of reform packages under uncertainty. American Economic Review 85(5):1207-1223.

Diamond, P., and M. Rothschild, eds. 1989 Uncertainty in Economics. San Diego, CA: Academic Press.

Eggertsson, T. 1990 Economic Behavior and Institutions. Cambridge, England: Cambridge University Press.

1994 The economics of institutions in transition economies. In Institutional Change and the Public Sector in Transitional Economies, Salvatore Schiavo-Campo, ed. Washington, DC: The World Bank.

Friedman, M. 1961 The lag in effect of monetary policy. Journal of Political Economy. Reprinted in Milton Friedman: Critical Assessments, Vol. 1, J. Cunningham and R.N. Woods, eds. New York and London: Routledge.

Furubotn, E.G. 1994 Future Development of the New Institutional Economics: Extension of the Neoclassical Model or New Construct? Jena Lectures Vol. 1. Jena, Germany: Max-Planck Institute for Research into Economic Systems.

Furubotn, E.G., and R. Richter, eds. 1993 The new institutional economics: Recent progress, expanding frontiers. Journal of Institutional and Theoretical Economics (Special Issue) 149(1). Tübingen, Germany: JITE.

Greif, A., P. Milgrom, and B.R. Weingast 1994 Coordination, commitment, and enforcement: The case of the merchant guild. Journal of Political Economy 102(3):745-776.

Hansen, B. 1963 Lectures in Economic Theory, Part III: The Theory of Economic Policy. Cairo: United Arab Republic Institute of Planning.

Heckman, J.J. 1992 Haavelmo and the birth of modern econometrics: A review of The History of Econometric Ideas by Mary Morgan. Journal of Economic Literature 30(2):876-886.

Hedlund, S., and N. Sundström 1996 Does Palermo represent the future for Moscow? Journal of Public Policy 19(2):113-156.

Hettich, W., and S.L. Winer 1993 Economic efficiency, political institutions and policy analysis. Kyklos 46(1):3-25.

Hirschman, A.O. 1981 The rise and decline of development economics. Pp. 1-24 in Essays in Trespassing: Economics to Politics and Beyond, A. Foxley, M.S. McPherson, and G. O'Donnell, eds. Cambridge, England: Cambridge University Press.

Hirshleifer, J., and J.G. Riley 1979 The analytics of uncertainty and information: An expository survey. Journal of Economic Literature 17(December):1375-1421.

1992 The Analytics of Uncertainty and Information. Cambridge, England: Cambridge University Press.

Kindleberger, C.P. 1958 Economic Development. New York: McGraw-Hill.

Kreps, D.M. 1990 A Course in Microeconomic Theory. New York: Harvester Wheatsheaf.

Lindbeck, A. 1994 Overshooting, reform and retreat of the welfare state. De Economist 104:1-19.

1995 Welfare state disincentives with endogenous habits and norms. Scandinavian Journal of Economics 97(4):477-494.

Lucas, R.E. 1976 Econometric policy evaluation: A critique. In Stabilization of the Domestic and International Economy, Karl Brunner and Allan H. Meltzer, eds. Amsterdam, The Netherlands: North-Holland Publishing Co.

McKinnon, R. 1991 The Order of Economic Liberalization. Baltimore: Johns Hopkins University Press.

McMillan, J. 1995 Markets in Transition. Symposium address at the Seventh World Congress of the Econometric Society, August, Tokyo. Department of Economics, University of California at San Diego.

McMillan, J., and B. Naughton, eds. 1996 Reforming Asian Socialism: The Growth of Market Institutions. Ann Arbor: University of Michigan Press.

Milgrom, P., and J. Roberts 1992 Economics, Organization and Management. Englewood Cliffs, NJ: Prentice Hall.

Moe, T.M. 1990 Political institutions: The neglected side of the story. Journal of Law, Economics and Organization (Special Issue) 6:213-254.

Mueller, D.C. 1989 Public Choice II. Cambridge, England: Cambridge University Press.

Murrell, P. 1995 The transition according to Cambridge, Mass. Journal of Economic Literature 33(1):164-178.

North, D.C. 1990 Institutions, Institutional Change, and Economic Performance. Cambridge, England: Cambridge University Press.

1993 Institutions and credible commitment. Journal of Institutional and Theoretical Economics 149(1):11-23.

1994 Economic performance through time. American Economic Review 84(3):359-368.

Ostrom, E. 1990 Governing the Commons: The Evolution of Institutions for Collective Action. Cambridge, England: Cambridge University Press.

Ostrom, E., L. Schroeder, and S. Wynne 1993 Institutional Incentives and Sustainable Development: Infrastructure Policies in Perspective. Boulder, CO: Westview Press.

Ostrom, E., R. Gardner, and J. Walker 1994 Rules, Games, and Common-Pool Resources. Ann Arbor: University of Michigan Press.

Posner, R.A. 1986 Economic Analysis of Law. Third Edition. Boston, MA: Little, Brown.

Putnam, R.D. 1993 Making Democracy Work: Civic Traditions in Modern Italy. Princeton, NJ: Princeton University Press.

Samuelson, P.A. 1947 Foundations of Economic Analysis. Cambridge, MA: Harvard University Press.

Simon, H.A. 1953 Causal ordering and identifiability. In Studies in Econometric Method, W.C. Hood and T.C. Koopmans, eds. New Haven, CT: Yale University Press. Reprinted in Herbert A. Simon (1957) Models of Man. New York: Garland Publishers.

Stiglitz, J.E. 1994 Whither Socialism? Cambridge, MA: The MIT Press.

Stone, A., B. Levy, and R. Paredes 1996 A comparative analysis of the legal and regulatory environment in Brazil and Chile. In Empirical Studies in Institutional Change, L. Alston, T. Eggertsson, and D. North, eds. Cambridge, England: Cambridge University Press.

Tinbergen, J. 1956 Economic Policy: Theory and Design. Amsterdam, The Netherlands: North-Holland Publishing Co.

Tooby, J., and L. Cosmides 1992 The psychological foundations of culture. In The Adapted Mind: Evolutionary Psychology and the Generation of Culture, J. Barkow, L. Cosmides, and J. Tooby, eds. New York: Oxford University Press.

Weingast, B.R. 1993 Constitutions as governance structures: The political foundations of secure markets. Journal of Institutional and Theoretical Economics 149:286-311.

1995 The economic role of political institutions: Market-preserving federalism and economic development. Journal of Law, Economics and Organization 11(1):1-31.

Werin, L., and H. Wijkander 1992 Contract Economics. Oxford, England: Blackwell.

Williamson, O.E. 1985 The Economic Institutions of Capitalism: Firms, Markets, Relational Contracting. New York: The Free Press.


5 Nobel Prize-Winning Economic Theories You Should Know About

Amy Fontinelle has more than 15 years of experience covering personal finance, corporate finance and investing.


The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel has been awarded 52 times to 86 Laureates who have researched and tested dozens of ground-breaking ideas. These five prize-winning economic theories are ideas you're likely to hear about in news stories because they apply to major aspects of our everyday lives.

Key Takeaways

  • Elinor Ostrom was awarded the prize in 2009 for her research and analysis of the economics of common-pool resources.
  • Daniel Kahneman's research on behavioral finance earned him the prize in 2002.
  • The Nobel Prize committee honored George A. Akerlof, A. Michael Spence, and Joseph E. Stiglitz in 2001 for their work on asymmetric information.
  • John C. Harsanyi, John F. Nash Jr., and Reinhard Selten received the prize in 1994 for research they conducted about the theory of non-cooperative games.
  • James M. Buchanan developed the theory of public choice for which he received the Nobel Prize in 1986.

1. Managing Common Pool Resources (CPRs)

The term common pool resources (CPRs) refers to resources that aren't owned by one particular entity. They're held by the government or they're allocated to privately owned lots that are made available to the general public. CPRs, or commons as they're commonly known, are available to everyone but are in finite supply. They include forests, waterways and water basins, and fishing grounds.

Ecologist Garrett Hardin wrote "The Tragedy of the Commons," which appeared in Science in 1968. He addressed the overpopulation of the human race in relation to these resources. Hardin surmised that everyone would act in their own best interests and would end up consuming as much as they possibly could. This would make these resources even harder for others to find.
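Hardin's logic is easy to reproduce. The toy game below uses an illustrative functional form (neither Hardin's nor Ostrom's model): each of N herders picks grazing effort, the return per unit of effort falls with total effort, and individually rational play overuses the pasture relative to the cooperative outcome.

```python
# Payoff to herder i: e_i * (1 - E), where E is total effort across herders.
N = 10
E_nash = N / (N + 1)    # symmetric Nash equilibrium total effort
E_coop = 0.5            # effort that maximizes the group payoff E * (1 - E)

def group_payoff(E):
    return E * (1 - E)

print(f"Nash total effort {E_nash:.2f}, group payoff {group_payoff(E_nash):.3f}")
print(f"Cooperative effort {E_coop:.2f}, group payoff {group_payoff(E_coop):.3f}")
```

As N grows, equilibrium effort approaches full exhaustion of the resource; Ostrom's point, discussed below, is that real communities often escape this outcome.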

Indiana University political science professor Elinor Ostrom became the first woman to win the prize in 2009. She received it "for her analysis of economic governance, especially the commons."

Ostrom's Groundbreaking Research

Ostrom's research showed how groups work together to manage common resources such as water supplies, fish, lobster stocks, and pastures through collective property rights. She showed that Hardin's prevailing tragedy of the commons theory isn't the only possible outcome or even the most likely outcome when people share a common resource.

Ostrom showed that CPRs can be effectively managed collectively without government or private control as long as those who use the resource are physically close to it and have a relationship with each other.

Outsiders and government agencies don't understand local conditions or norms and they lack relationships with the community so they may manage common resources poorly. By contrast, insiders with a say in resource management will self-police to ensure that all participants follow the community's rules.

You can read about Ostrom's prize-winning research in her book, Governing the Commons: The Evolution of Institutions for Collective Action, and in her 1999 Science journal article, "Revisiting the Commons: Local Lessons, Global Challenges."

2. Behavioral Finance

Behavioral finance is a form of behavioral economics. It studies the psychological influences and biases that affect the behavior and decisions of investors as well as financial professionals. These influences and biases tend to explain various market anomalies, especially those found in the stock market. This includes very drastic increases and drops in the price of securities.

Psychologist Daniel Kahneman was awarded the prize in 2002 "for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty."

Kahneman's Work

Kahneman showed that people do not always act out of rational self-interest as the economic theory of expected utility maximization would predict. This concept is crucial to behavioral finance. The research identified common cognitive biases that cause people to use faulty reasoning to make irrational decisions. These biases include the anchoring effect, the planning fallacy, and the illusion of control.

He conducted his research with Amos Tversky but Tversky wasn't eligible to receive the prize because he died in 1996.

Kahneman and Tversky's Theory

"Prospect Theory: An Analysis of Decision Under Risk," is one of the most frequently cited articles in economics journals. Kahneman and Tversky's award-winning prospect theory shows how people make decisions in uncertain situations.

They demonstrated that we tend to use irrational guidelines such as perceived fairness and loss aversion. They're based on emotions, attitudes, and memories, not logic. Kahneman and Tversky observed that we expend more effort just to save a few dollars on a small purchase than to save the same amount on a large purchase.
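The loss-aversion claim has a standard functional form. The sketch below uses Kahneman and Tversky's published 1992 median parameter estimates (alpha = beta = 0.88, lambda = 2.25); treat it as an illustration of the value function's shape, not as investment math.

```python
# Tversky-Kahneman value function: concave over gains, convex over losses,
# with losses weighted about 2.25x as heavily as gains.
ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25

def value(x):
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

for outcome in (100, -100):
    print(f"outcome {outcome:+d}: subjective value {value(outcome):+.1f}")
# A $100 loss 'hurts' about 2.25 times as much as a $100 gain pleases.
```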

Kahneman and Tversky also showed that people use general rules such as representativeness to make judgments that contradict the laws of probability. When given the description of a woman concerned about discrimination and asked if she is more likely to be a bank teller or a bank teller who is a feminist activist, people tend to assume she is the latter even though probability laws tell us she is much more likely to be the former.
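The specific law being violated here is the conjunction rule: a conjunction can never be more probable than either of its parts. A short check with made-up numbers:

```python
p_teller = 0.05                    # hypothetical P(bank teller)
p_feminist_given_teller = 0.40     # hypothetical P(feminist activist | teller)
p_both = p_teller * p_feminist_given_teller
print(p_both <= p_teller)          # True for any choice of probabilities
```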

The Nobel Prize is not awarded posthumously.

3. Asymmetric Information

The asymmetric information discipline is also known as information failure. It occurs when one party involved in an economic transaction has much more knowledge than the other. This phenomenon typically presents itself when the seller of a good or service possesses greater knowledge than the buyer but the reverse dynamic may also be possible in some cases. Almost all economic transactions involve asymmetric information.

George A. Akerlof, A. Michael Spence, and Joseph E. Stiglitz won the prize "for their analyses of markets with asymmetric information" in 2001. The trio showed that economic models that are predicated on perfect information are often misguided because one party often has superior information in a transaction.

Understanding information asymmetry has improved our knowledge of how various markets work and the importance of corporate transparency. These concepts have become so widespread that we take them for granted but they were groundbreaking when they were first developed.

Akerlof, Spence, and Stiglitz's Research

Akerlof showed how information asymmetries in the used car market, where sellers know more than buyers about the quality of their vehicles, can create a market with lemons, a concept known as "adverse selection." A key publication related to this prize is Akerlof's 1970 journal article, "The Market for 'Lemons': Quality Uncertainty and the Market Mechanism."
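The unraveling mechanism is simple enough to simulate. In the sketch below (illustrative numbers, not Akerlof's worked example), quality is uniform on [0, 1], sellers keep any car worth more than the going price, and buyers, who value quality at 1.5x but cannot observe it, will only pay for the average car actually offered.

```python
price = 1.0
for round_no in range(1, 9):
    avg_quality_offered = price / 2      # only cars worth less than `price` are for sale
    price = 1.5 * avg_quality_offered    # buyers pay their value of that average
    print(f"round {round_no}: price falls to {price:.3f}")
# Price -> 0: the best cars exit first and the market unravels (adverse selection).
```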

Spence's research focused on signaling or how better-informed market participants can transmit information to lesser-informed participants. He showed how job applicants can use educational attainment as a signal to prospective employers about their likely productivity and how corporations can signal their profitability to investors by issuing dividends.

Stiglitz showed how insurance companies can learn which customers present a greater risk of incurring high expenses, a process he called screening. According to Stiglitz, insurers screen customers by offering different combinations of deductibles and premiums.

4. Game Theory

The theory of non-cooperative games is a branch of the analysis of strategic interaction commonly known as game theory. Non-cooperative games are those in which participants cannot make binding agreements. Each participant bases his or her decisions on how he or she expects the other participants to behave without knowing how they will actually behave.

The academy awarded the 1994 prize to John C. Harsanyi, John F. Nash Jr., and Reinhard Selten "for their pioneering analysis of equilibria in the theory of non-cooperative games."

Harsanyi, Nash, and Selten's Analysis

One of Nash's major contributions was the Nash Equilibrium, a method for predicting the outcome of non-cooperative games based on equilibrium. Nash's 1950 doctoral dissertation, "Non-Cooperative Games," details his theory. The Nash Equilibrium expanded upon earlier research on two-player, zero-sum games.
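The equilibrium concept is easy to check by brute force. The sketch below uses textbook prisoner's dilemma payoffs (chosen for illustration; they are not from Nash's thesis) and tests every strategy pair for profitable unilateral deviations.

```python
from itertools import product

PAYOFFS = {  # (row move, column move) -> (row payoff, column payoff)
    ("C", "C"): (3, 3), ("C", "D"): (0, 5),
    ("D", "C"): (5, 0), ("D", "D"): (1, 1),
}
MOVES = ("C", "D")

def is_nash(row, col):
    # Nash equilibrium: no player can gain by deviating on their own.
    row_ok = all(PAYOFFS[(row, col)][0] >= PAYOFFS[(alt, col)][0] for alt in MOVES)
    col_ok = all(PAYOFFS[(row, col)][1] >= PAYOFFS[(row, alt)][1] for alt in MOVES)
    return row_ok and col_ok

print([cell for cell in product(MOVES, MOVES) if is_nash(*cell)])  # [('D', 'D')]
```

Mutual defection is the unique equilibrium even though mutual cooperation pays both players more, which is part of what makes the concept so useful for analyzing strategic settings such as oligopoly.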

Selten applied Nash's findings to dynamic strategic interactions, and Harsanyi applied them to scenarios with incomplete information, helping to develop the field of information economics. Their contributions are widely used in economics, such as in the analysis of oligopoly and the theory of industrial organization, and they have inspired new fields of research.

5. Public Choice Theory

This theory attempts to provide the rationale behind public decisions. It involves the participation of the general public, elected officials, and political committees, along with the bureaucracy that's set up by society. James M. Buchanan Jr. developed the public choice theory with Gordon Tullock.

James M. Buchanan Jr. received the prize in 1986 "for his development of the contractual and constitutional bases for the theory of economic and political decision-making."

Buchanan's Award-Winning Theory

Buchanan's major contributions to public choice theory bring together insights from political science and economics to explain how public-sector actors such as politicians and bureaucrats make decisions. Conventional wisdom held that public sector actors behave in the public's best interest, as public servants. Buchanan showed instead that politicians and bureaucrats tend to act in self-interest, the same way private sector actors like consumers and entrepreneurs do.

He described his theory as "politics without romance." Buchanan laid out his award-winning theory in a book he co-authored with Gordon Tullock in 1962, The Calculus of Consent: Logical Foundations of Constitutional Democracy .

We can get a better understanding of the incentives that motivate political actors and better predict the results of political decision-making using Buchanan's insights about the political process, human nature, and free markets. We can then design fixed rules that are more likely to lead to desirable outcomes.

Instead of allowing deficit spending, which political leaders are motivated to engage in because each program the government funds earns them support from a group of voters, we can impose a constitutional restraint on government spending, one that benefits the general public by limiting the tax burden.

Honorable Mention: Black-Scholes Theorem

Robert Merton and Myron Scholes won the 1997 Nobel Prize in economics for the Black-Scholes theorem, a key concept in modern financial theory that's commonly used for valuing European options and employee stock options.

The formula is complicated but investors can use an online options calculator to get its results by inputting an option's strike price, the underlying stock's price, the option's time to expiration, its volatility, and the market's risk-free interest rate. Fischer Black also contributed to the theorem but couldn't receive the prize because he passed away in 1995.
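For readers who want to see the machinery, here is the standard no-dividend Black-Scholes call formula in a few lines; the input values are hypothetical, and real valuation work should rely on a vetted library.

```python
from math import erf, exp, log, sqrt

def norm_cdf(x: float) -> float:
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, sigma, r):
    """European call value under Black-Scholes (no dividends)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Hypothetical inputs: spot 100, strike 105, six months, 25% vol, 3% rate.
print(f"call value: {bs_call(S=100, K=105, T=0.5, sigma=0.25, r=0.03):.2f}")
```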

What Is the Anchoring Effect?

The Program on Negotiation at Harvard Law School describes the anchoring effect as a "cognitive bias that describes the common human tendency to rely too heavily on the first piece of information offered...when making decisions." That first piece of information is the "anchor."

What Is Another Component of Asymmetric Information?

Adverse selection and moral hazard are two common versions of asymmetric information. Both imply an unlevel playing field. One party is more knowledgeable about the subject at hand than the other. Adverse selection implies that one party has information that the other doesn't possess. Moral hazard is associated with one party taking risks in a transaction because they know they won't be held accountable financially or morally.

Who Decides Who Wins the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel?

The Royal Swedish Academy of Sciences makes the final decision among nominees based on recommendations from the Economic Sciences Prize Committee. Nomination is by invitation only. The Committee then screens the candidates.

The Bottom Line

Each of the dozens of winners of the Nobel memorial prize in economics has made outstanding contributions to the field. The other award-winning theories are worth getting to know, too. Working knowledge of the theories described here will help you establish yourself as someone who is in touch with the economic concepts that are essential to our lives.

The Nobel Prize Organisation. "About the Prize."

The Nobel Prize Organisation. "Elinor Ostrom: Facts."

Science.org. "Revisiting the Commons: Local Lessons, Global Challenges."

Cambridge University Press. "Governing the Commons: The Evolution of Institutions for Collective Action."

CFI Education. "Behavioral Finance."

The Nobel Prize Organisation. "The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel 2002."

The Nobel Prize Organisation. "Daniel Kahneman—Biographical."

The Nobel Prize Organisation. "Information for the Public."

The Nobel Prize Organisation. "George A. Akerlof Article."

The Nobel Prize Organisation. "Press Release 11 October 1994."

San Jose State University Economics Department. "Public Choice Theory."

The Nobel Prize Organisation. "Press Release 16 October 1986."

The Nobel Prize Organisation. "Press Release 14 October 1997."

Program on Negotiation Harvard Law School. "The Anchoring Effect and How It Can Impact Your Negotiation."

Intelligent Economist. "Asymmetric Information."

The Nobel Prize Organisation. "Nomination and Selection of Economic Sciences Laureates."


Why Are People So Down About the Economy? Theories Abound.

Things look strong on paper, but many Americans remain unconvinced. We asked economic officials, the woman who coined “vibecession” and Charlamagne Tha God what they think is happening.


[Illustration: distorted mirrors reflecting the U.S. economy]

By Jeanna Smialek

The U.S. economy has been an enigma over the past few years. The job market is booming, and consumers are still spending, which is usually a sign of optimism. But if you ask Americans, many will tell you that they feel bad about the economy and are unhappy about President Biden’s economic record.

Call it the vibecession. Call it a mystery. Blame TikTok, media headlines or the long shadow of the pandemic. The gloom prevails. The University of Michigan consumer confidence index, which looked a little bit sunnier this year after a substantial slowdown in inflation over 2023, has again soured. And while a measure of sentiment produced by the Conference Board improved in May, the survey showed that expectations remained shaky.

The negativity could end up mattering in the 2024 presidential election. More than half of registered voters in six battleground states rated the economy as “poor” in a recent poll by The New York Times, The Philadelphia Inquirer and Siena College. And 14 percent said the political and economic system needed to be torn down entirely.

What’s going on here? We asked government officials and prominent analysts from the Federal Reserve, the White House, academia and the internet commentariat about what they think is happening. Here’s a summary of what they said.

Kyla Scanlon, coiner of the term ‘Vibecession’

Price levels matter, and people are also getting some facts wrong.

The most common explanation for why people feel bad about the economy — one that every person interviewed for this article brought up — is simple. Prices jumped a lot when inflation was really rapid in 2021 and 2022. Now they aren’t climbing as quickly, but people are left contending with the reality that rent, cheeseburgers, running shoes and day care all cost more.

“Inflation is a pressure cooker,” said Kyla Scanlon, who this week is releasing a book titled “In This Economy?” that explains common economic concepts. “It hurts over time. You had a couple of years of pretty high inflation, and people are really dealing with the aftermath of that.”

But Ms. Scanlon also pointed out that knowledge gaps could be part of the problem: A Harris poll for The Guardian this month found that a majority of Americans (incorrectly) believed that the United States was in a recession. About half said they believed the stock market was down from last year, though it is up considerably.

“Yes, there is economic frustration, but these are objectively verifiable facts,” she said.

Raphael Bostic, president of the Federal Reserve Bank of Atlanta

Part of this is about memory.

A big question is why — when the economy is growing, unemployment is historically low and stock prices are climbing — things feel so dim.

“When I talk to folks, they all tell me that they want interest rates to be lower, and they also tell me that prices are too high,” Raphael Bostic told reporters last week. “People remember where prices used to be, and they remember that they didn’t have to talk about inflation, and that was a very comfortable place.”

Mr. Bostic and his colleagues at the Fed have raised interest rates to a more-than-two-decade high in an effort to bring down the rapid price increases, and he said the key was wrestling inflation back to normal quickly.

Jared Bernstein, chairman of the White House Council of Economic Advisers

Catching up with inflation takes time.

As inflation cools, there is some hope that the negativity could fade. Jared Bernstein noted that for the past 14 months, middle-class wage growth has been beating inflation, and predicted that people would feel better as wages caught up to higher price levels.

“If that were wrong, everyone would be walking around eternally upset that gas doesn’t cost $1 a gallon,” Mr. Bernstein said. “The two components of that adjustment are time plus rising real pay.”

Loretta Mester, president of the Cleveland Fed

Wages have lagged.

But not everyone has broken even at this point, and that could be part of the explanation behind the continued pessimism. On average, pay gains have not fully caught up with the jump in prices since the start of the pandemic, if you compare Consumer Price Index increases with a wages and salary measure that Fed officials watch closely.

“They still haven’t made up for all of the lost ground,” Loretta Mester said. “They’re still in a hole, a little bit.”
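
To see how pay can be rising faster than prices today yet still be behind cumulatively, a stylised calculation helps (the figures below are purely illustrative, not the actual CPI or wage series):

```python
# Illustrative only: hypothetical cumulative growth since the start of
# the pandemic, not the actual CPI or employment cost index figures.
prices_up = 0.20   # price level up 20 per cent overall
wages_up = 0.17    # nominal wages up 17 per cent overall

# Real (inflation-adjusted) wage change over the whole period.
real_change = (1 + wages_up) / (1 + prices_up) - 1
print(f"Real wages: {real_change:+.1%}")  # -> -2.5%: still 'in a hole'

# Even if wages now outpace inflation year over year, the cumulative
# gap closes only gradually -- Mr. Bernstein's 'time plus rising real pay'.
```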

Ms. Mester noted that people were also struggling to afford houses, because prices have shot up in many places and high interest rates are making first-time homeownership difficult, putting that part of the American dream out of reach for many.

Lawrence H. Summers, Harvard economist and commentator

Interest rates are part of the issue.

That touches on an issue that Lawrence H. Summers recently raised in an economic paper: For most people, the higher interest rates that the Fed is using to try to slow demand and squash price increases feel like just another form of inflation. In fact, if high interest rates are folded into the inflation measure, that explains most of the gap between where consumer confidence is and where one might expect it to be.

“The experienced cost of living is much greater than inflation as reflected by the Consumer Price Index,” Mr. Summers said in an interview. He noted that consumer confidence improved when market-based rates, which feed into mortgage and leasing costs, eased early this year, then sank again as they rose.
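
A toy version of that adjustment makes the intuition concrete. This is not the method in Mr. Summers's paper, just an assumed blend of CPI inflation with a borrowing-cost term, using made-up numbers:

```python
# Hypothetical figures throughout; this only sketches the intuition that
# higher borrowing costs feel like another form of inflation.
cpi_inflation = 0.034        # assume CPI up 3.4% over the year
borrow_weight = 0.10         # assumed share of budgets going to debt service
borrow_cost_change = 0.25    # assume interest payments up 25%

experienced = (1 - borrow_weight) * cpi_inflation \
    + borrow_weight * borrow_cost_change
print(f"'Experienced' inflation: {experienced:.1%}")  # ~5.6%, well above CPI
```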

Charlamagne Tha God, radio host

People remember more comfortable times.

Whatever is causing the unhappiness, it seems to be translating into negativity toward Mr. Biden. In the recent Times poll, many said they thought the economic and political system needed to be changed, and fewer said they thought that Mr. Biden, as opposed to former President Donald J. Trump, would usher in big alterations.

Charlamagne Tha God recently suggested on “The Interview,” a Times podcast, that Black voters in particular might be turning from Mr. Biden and toward Mr. Trump because they associated the former president with the last time they felt financially secure. Mr. Trump’s administration sent out two rounds of stimulus relief checks, which bore Mr. Trump’s signature; Mr. Biden’s administration sent out one round, which did not bear his. And inflation began to pop in 2021, after Mr. Trump left office.

“People are living paycheck to paycheck,” Charlamagne said during a follow-up interview specifically about the economy. “You don’t know struggle until you’ve had to decide whether you’re going to pay for your car or pay for your rent.”

To his point, rents are up drastically since before the pandemic, and auto loan delinquencies are rising sharply. While inflation and higher interest rates have been a global phenomenon, people tend to blame the current economic challenges on whoever is in office.

“People can’t see past their bills,” Charlamagne said. “All we want is upward mobility and security, and whoever can provide that, even for a fleeting moment, you never forget it.”

Susan Collins, president of the Boston Fed

People are anxious postpandemic.

In fact, the recent economy has offered something of a split screen: Some people are doing really well, watching their retirement portfolios improve and their home prices appreciate. But those people were often already well off. Meanwhile, people carrying credit card balances are facing much higher rates, and many Americans have exhausted whatever savings they managed to amass during the pandemic.

“There are groups that are doing really, really, well, and there also are groups that are struggling,” Susan Collins said. “We talk to individuals who are having a lot of trouble making ends meet.”

But she also noted that the period since the pandemic had been fraught with uncertainty. Changes to interest rate policies, years of inflation, and headlines about war and geopolitical upheaval may have shaken how people view their economic situations.

“I think that there is a different level of anxiety postpandemic that is hard to rule out,” Ms. Collins said.

Aaron Sojourner, the W.E. Upjohn Institute

Some of this may be about media negativity.

Still, there’s one enduring mystery about the vibecession. People tend to be more optimistic about their personal economic situations than they are about the economy as a whole.

That could be because Americans rely on the media for their perception of national economic conditions, and news sentiment has grown more downbeat in recent years, said Aaron Sojourner, who recently wrote a study suggesting that economic news coverage has become more negative since 2018, and much more negative since 2021.

“For the last six years, the tone of economic news has been considerably more sour and negative than would be predicted based on macroeconomic variables,” he said.

But he acknowledged that journalists factored real experiences and consumer sentiment data into their reporting, so it is difficult to know to what degree bad vibes are driving negative news and how much negative news is driving bad vibes.

“Does the sentiment cause the news, or does the news tone cause the sentiment? I don’t know,” Mr. Sojourner said.

Jeanna Smialek covers the Federal Reserve and the economy for The Times from Washington.

Analysis: RBA should now 'in theory' hike interest rates in June, but will it?

Inflation is rising again, and that's a problem for both the government and the Reserve Bank.

It's rising because some once heavily discounted items in the shops, former Treasury economist Warren Hogan says, aren't being so heavily discounted any more.

"One of the reason inflation's fallen in the past year is because goods prices have been coming off and, in fact falling in recent times."

"They've [now] jumped right back up."

Bread and cereal products, for example, rose 5.1 per cent in the year to April.

And fruit and vegetables rose 3.5 per cent.

At the same time, says Warren Hogan, the cost of services, including rents, healthcare, banking and insurance, and education, remains elevated.

"So we're no longer talking about inflation that's falling back towards target."

"It's questionable whether it's even sticky.

"It looks like it could be picking up in 2024," Warren Hogan says.

Bumpy inflation data

It's important to note that the Bureau of Statistics' monthly inflation data is known to be a little bumpy.

However, crucially, the core, or underlying, measures of inflation published Wednesday also show inflation is not just stalling but picking up again.

The trimmed mean measure, for example, which strips out the largest price rises and falls to look at inflation without the ups and downs, rose from 4.0 per cent to 4.1 per cent over the 12 months to April.
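
For readers unfamiliar with the statistic, a trimmed mean drops the items with the most extreme price movements and averages the rest. The sketch below is a simplified version of the idea, not the ABS's exact procedure (which trims 15 per cent of the weighted distribution from each tail, with finer handling of items straddling the cutoffs):

```python
import numpy as np

def trimmed_mean(price_changes, weights, trim=0.15):
    """Simplified trimmed mean: sort items by price change, drop the
    most extreme movements in each tail by expenditure weight, and take
    the weighted average of what remains."""
    order = np.argsort(price_changes)
    changes = np.asarray(price_changes, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]
    cum = np.cumsum(w) / w.sum()              # cumulative weight share
    keep = (cum > trim) & (cum <= 1 - trim)   # keep the middle of the basket
    return np.average(changes[keep], weights=w[keep])

# Hypothetical basket: annual price changes (%) and expenditure weights.
changes = [9.0, 5.1, 3.5, 2.0, -1.5]     # e.g. insurance, bread, fruit, ...
weights = [0.10, 0.15, 0.15, 0.40, 0.20]
print(f"Trimmed mean inflation: {trimmed_mean(changes, weights):.1f}%")
```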

There's no doubt shoppers have tightened their purse strings.

It's clear, however, there's still plenty of money being pumped into the economy, and this is lifting what economists and the Reserve Bank call "aggregate", or overall, demand.

"The reality is there's a huge amount of investment spending going on: infrastructure boom outside of mining and across the easter seaboard; the need to build new dwellings."

"Demand in the economy, overall, is too strong.

"At the moment consumers are the ones copping all the pressure from the income tax burden, from interest rates going up, inflation, consumption is soft, but everything else is doing OK," Warren Hogan says.

Shadow Treasurer Angus Taylor described the ABS data as a "shocking set of numbers."

"The point here is the pain that every Australian household is feeling now."

"They are continuing to pay more for their groceries.

"They're continuing to pay more for their housing, whether they own it with a mortgage or they're renting," Mr Taylor said.

Battle to tame inflation

Rental price inflation rose 7.5 per cent in the year to April, down from 7.7 per cent in the year to March.

The Treasurer, Jim Chalmers, concedes the battle to tame inflation is not yet won.

"We know there is more work to do in the fight against inflation because it is still too high and people are under pressure and that's why the Budget had such a big focus on providing responsible cost-of-living relief."

But for the millions of mortgage borrowers across the country, especially those with large monthly repayments, what does this mean for the Reserve Bank and its monetary policy?

Warren Hogan thinks Wednesday's inflation data will push the RBA into hiking interest rates at its June meeting.

"The higher inflation result today I think will tip them over the edge and I think there's a very good chance we'll see a rate hike in June."

"The inflation picture is deteriorating, not getting better.

"Look I think [another RBA interest rate hike] will just take some demand out of the economy.

"It does raise an interesting and important question: whether one rate hike is enough to actually be meaningful?

"It'll obviously do a lot of damage to households that have got large mortgages and significant mortgage repayments.

"There is a broader strength of demand.

"Maybe there needs to be two more rate hikes," Warren Hogan said.

AMP's deputy chief economist Diana Mousina disagrees and has held onto her view the Reserve Bank will cut interest rates in November.

Why? Well, she doesn't think today's inflation numbers will surprise the RBA.

"At their last meeting they said that unless inflation was moving out of its forecast that it was happy to keep interest rates where they were."

"And the April inflation data was basically in line with the Reserve Bank's forecast.

"They think that inflation's going to be running at 3.8 per cent," Diana Mousina said.

Diana Mousina says it would be difficult to justify an RBA interest rate hike amid weak economic growth.

"The unemployment rate went up more than expected."

"[There's] very weak retail data, and the GDP result for next week is going to look pretty soft," she said.

Interest rate debate

Perhaps this comment from senior Marcus Today markets analyst Henry Jennings provides some sort of synthesis of the interest rate debate.

"The interest rates will stay where they are for a long while yet," he noted.

"In theory, rates should rise, but anyone who has stepped out from Martin Place into the real world will know that many are doing it really tough.

"A rate increase would be 'courageous Minister' as Sir Humphrey would say.

"Not going to happen."

The Australian stock market fell sharply immediately following the release of the inflation data as investors pushed back their expectations of an RBA interest rate cut in 2024.

The Reserve Bank has also been at pains to emphasise its laser-like focus on inflation expectations.

Put simply, the bank says, the longer inflation stays elevated the harder it is to ultimately get back down.

Inflation becomes entrenched.

So then, what will it take for the inflation rate to get back to the Reserve Bank's target range of two to three per cent?

Well, economists expect headline inflation, the all-items measure, to fall sharply when federal and state government electricity rebates come into effect from July.

But that fall comes ahead of billions of extra dollars in state and federal government stimulus later this year, and the potential for a Fair Work Commission increase in the minimum wage.

The inflation battle appears to be stuck in a tug-of-war.
