Grants Database

The Foundation awards approximately 200 grants per year (excluding the Sloan Research Fellowships), totaling roughly $80 million in annual commitments in support of research and education in science, technology, engineering, mathematics, and economics. This database contains grants for currently operating programs going back to 2008. For grants from prior years and for now-completed programs, see the annual reports section of this website.

  • grantee: University of Michigan
    amount: $20,000
    city: Ann Arbor, MI
    year: 2017

    To support a symposium for researchers, policymakers, and financial experts that will highlight interactions between behavioral economics and macroeconomics

    • Program Research
    • Sub-program Economics
    • Investigator Michael Barr

  • grantee: Tufts University
    amount: $20,000
    city: Medford, MA
    year: 2017

    To support a mathematical workshop on the Geometry of Redistricting

    • Program Research
    • Sub-program Economics
    • Investigator Moon Duchin

  • grantee: Harvard University
    amount: $20,000
    city: Cambridge, MA
    year: 2017

    To support the Third Annual Conference on Big Data at the Harvard Center for Mathematical Sciences and Applications

    • Program Research
    • Sub-program Economics
    • Investigator Shing-Tung Yau

  • grantee: Massachusetts Institute of Technology
    amount: $502,129
    city: Cambridge, MA
    year: 2017

    To explore the effects of robots on employment, wages, and productivity

    • Program Research
    • Sub-program Economics
    • Investigator Daron Acemoglu

    This grant funds work by economists Daron Acemoglu of MIT and Pascual Restrepo of Boston University, who are investigating the economics of robotics and automation. These two researchers have begun developing a conceptual framework to understand how robotics is affecting the economy. The effects of new automation technologies, they maintain, can best be understood by explicitly examining how fast and how thoroughly they replace human labor in the performance of specific tasks. One virtue of such a framework is that it helps distinguish between the “displacement effect” of automation—the way it can reduce demand for certain kinds of labor—and the “productivity effect” of automation—the way it can increase the value of certain sorts of labor by making laborers more productive. Using this framework, Acemoglu and Restrepo estimate that an increase of one new robot per thousand workers in the U.S. economy reduces the ratio of employment to population by 0.5 percentage points and reduces average wages by 1 percent in a local labor market with the average U.S. exposure to robots relative to a local labor market with no exposure to robots. Grant funds will support the extension and refinement of Acemoglu and Restrepo’s work, including plans to disaggregate effects across various labor markets by studying long-term, fine-grained data at the firm level. The project promises to generate at least six academic papers based on this work.
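
    As a back-of-the-envelope illustration of the headline estimate quoted above (and not of Acemoglu and Restrepo's actual econometric model), the sketch below simply scales the reported per-robot effects by a market's robot exposure; the exposure figure in the example is hypothetical.

```python
# Rough arithmetic illustration of the reported estimates; not the authors'
# model. Exposure values are hypothetical inputs chosen to show the scaling.
EMPLOYMENT_EFFECT_PP = -0.5  # change in employment-to-population ratio (pp)
WAGE_EFFECT_PCT = -1.0       # change in average wages (percent)
# ...each per one additional robot per thousand workers

def predicted_effects(robots_per_thousand_workers: float) -> dict:
    """Scale the reported per-robot estimates by a market's robot exposure."""
    return {
        "employment_to_population_change_pp": EMPLOYMENT_EFFECT_PP * robots_per_thousand_workers,
        "average_wage_change_pct": WAGE_EFFECT_PCT * robots_per_thousand_workers,
    }

# Example: a hypothetical local labor market adding 2 robots per 1,000 workers
print(predicted_effects(2.0))
# {'employment_to_population_change_pp': -1.0, 'average_wage_change_pct': -2.0}
```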

  • grantee: Columbia University
    amount: $490,298
    city: New York, NY
    year: 2017

    To integrate behavioral insights into the foundations of standard macroeconomic models by re-examining the role of the Euler equation

    • Program Research
    • Sub-program Economics
    • Investigator Emi Nakamura

    How do people decide between consuming more today and saving more for the future? Mainstream macroeconomists have one answer: the Euler Equation. Simply put, it says that an optimizing agent will consume today up to the point where adding one more unit now would provide the same utility that could be expected if consuming that extra unit were deferred until tomorrow instead. In principle, the attitude expressed by the Euler Equation seems reasonable enough. Surely if you knew that having a second dessert right now, for example, would not be as enjoyable as having that dessert tomorrow, you would do well to wait. Yet as a practical matter, such calculations are difficult or impossible for individuals to make. And we all know from experience that hardly anyone ever tries. Real people rely on heuristics at best, and are sometimes not only inconsistent but also self-defeating. The Euler Equation also has theoretical implications that limit its applicability to the real world. For instance, to a population governed by the Euler Equation, the timing of consumption does not depend on when income arrives. So Euler populations will not alter their behavior in response to income events like tax cuts. But people in the real world clearly do alter their behavior. This grant funds the research of Emi Nakamura and Jon Steinsson of Columbia University to test alternatives to the Euler Equation against a uniquely comprehensive dataset of the consumer behavior of the residents of Iceland, which has for years usefully kept records of nearly every financial transaction in the country. The goal is to devise a replacement for the Euler Equation and to create new macroeconomic models that are both less naïve and more useful in predicting consumer behavior in the real world.
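
    In textbook form (a standard statement of the idea, not the particular specification Nakamura and Steinsson will test), the consumption Euler equation described above says that the marginal utility of consuming one more unit today equals the discounted, expected marginal utility of saving that unit and consuming it tomorrow:

```latex
% Standard consumption Euler equation (textbook form)
% u'   : marginal utility of consumption
% beta : discount factor,  r_{t+1} : return on saving between t and t+1
u'(c_t) = \beta \, \mathbb{E}_t\!\left[ (1 + r_{t+1}) \, u'(c_{t+1}) \right]
```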

  • grantee: Brookings Institution
    amount: $632,355
    city: Washington, DC
    year: 2017

    To promote independent, unbiased, and nonpartisan economic research on regulatory economics

    • Program Research
    • Sub-program Economics
    • Investigator Adam Looney

    Effective government regulations can improve citizens’ health, safety, and financial well-being and reduce market imperfections. On the other hand, regulations that are poorly designed or implemented can impair markets, impose burdens, and impede innovation. There are potential benefits from regulatory interventions that mitigate imperfections but also potential costs from necessarily imperfect regulation. The challenge is to find an appropriate balance. This grant provides support for an initiative at the Brookings Institution to found a new, evidence-based, non-ideological Center on Regulation and Markets. In recent years, the trend has been for academics interested in regulation to specialize in environmental, health, labor, or other specific regulatory contexts. While this approach has many merits, such specialization deprives the field of the insights and wisdom that come from the wider study of regulation as such. The new Brookings Center will aim to recapture those insights and revitalize regulatory economics by incorporating recent behavioral, technological, societal, and legal perspectives. The new Center will initially concentrate on three work streams: Regulatory Processes and Perspectives, Market and Government Failures, and the Regulation of Financial Markets. Specific topics range from autonomous vehicles and the sharing economy to bankruptcy law and cost/benefit estimation methods. Outputs will include peer-reviewed papers, policy briefs, roundtables, and conferences.

  • grantee: Private Capital Research Institute
    amount: $500,000
    city: Boston, MA
    year: 2017

    To set up an Administrative Data Research Facility that makes data about the private capital industry accessible to researchers

    • Program Research
    • Sub-program Economics
    • Investigator Josh Lerner

    Representing roughly $4 trillion globally, private capital plays an outsize role in productivity trends since its investments traditionally promote innovation and reorganization. Since private equity is private, however, there is very little available data on how venture capital or private equity firms invest in companies. What few studies we do have come from proprietary data that cannot be shared and thus cannot be subjected to normal scientific attempts to replicate or check findings and results. Josh Lerner, a distinguished scholar at the Harvard Business School, is so keen on making data about this sector more available to academic researchers that he established a nonprofit, the Private Capital Research Institute (PCRI), explicitly for that purpose. This grant funds a project by Lerner and his team at PCRI to compile a large dataset of Certificates of Incorporation (COIs). COI filings record significant details about the provision of private capital, including information on the capital structure and key terms of venture capital deals along with important information about valuation. Though nominally public, COIs are in practice quite difficult to obtain or study other than one at a time. Lerner and the PCRI staff will use grant funds to acquire approximately 6,000 COIs and begin compiling a database that tracks 20-30 variables contained in COIs. The database will then be made available to academic researchers.
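
    The grant description does not enumerate the variables the PCRI database will track; purely as a hypothetical illustration of the kind of record a COI dataset might contain, a minimal sketch:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record layout for illustration only; the actual PCRI database
# fields (20-30 variables per COI) are not specified in the grant description.
@dataclass
class COIRecord:
    company_name: str
    filing_date: str                  # e.g., "2016-03-14"
    authorized_shares: int            # capital structure
    preferred_share_classes: int
    liquidation_preference: float     # a key deal term
    price_per_share: Optional[float]  # an input to implied valuation

record = COIRecord("ExampleCo", "2016-03-14", 10_000_000, 2, 1.0, 2.50)
print(record)
```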

  • grantee: Urban Institute
    amount: $616,926
    city: Washington, DC
    year: 2017

    To demonstrate new statistical and visualization capabilities by migrating massive microsimulation models to the cloud

    • Program Research
    • Sub-program Economics
    • Investigator Robert McClelland

    Evaluating the impact of proposed changes to the law requires predicting how people’s behavior will change in response to this or that policy change. These predictions are made using microsimulations. Researchers compile data from a representative sample of the population, run models that estimate what those individuals will do in response to changes in, say, the tax code, and then aggregate the results. This is a traditional tool not just for economists but also for the study of traffic, finance, epidemiology, and crowds. The problem with microsimulations, however, is that they are computationally unwieldy. Running a sophisticated model requires lots of time and computing power. Funds from this grant support efforts by Robert McClelland at the Urban Institute’s Tax Policy Center (TPC) to take the next big step in microsimulation by harnessing the power of cloud-based computing. McClelland will move the TPC’s existing tax policy evaluation microsimulation models to the cloud, allowing the models both to run faster and to run many simulations at once. This will make it routinely practical, for example, to see how robust results are to changes in parameter choices, to evaluate many different policy options and see which works best, and to handle nonlinearities due to thresholds in the tax code where different rules kick in or out. Basic statistical tasks—like obtaining variances, building confidence intervals, or testing hypotheses—should run in a matter of hours rather than months. These new capabilities will greatly enhance how useful TPC’s models are for rapidly understanding proposed changes in the tax code. The TPC team will then test these new capabilities by investigating three specific research questions: How does uncertainty in growth rates and recession timing affect projected tax revenues? How does sampling variation affect model behavior? And how can tax policies improve distributional outcomes without reducing revenue? Lastly, TPC will also launch an interactive website where the public can explore and visualize tax plans of their own design in real time.
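
    As a toy illustration of the per-record-then-aggregate pattern described above (and not of TPC's production models), the sketch below applies a hypothetical flat tax rule to a simulated sample and compares aggregate revenue under two rates:

```python
import random

# Toy microsimulation: apply a hypothetical tax rule to each simulated
# household, then aggregate. Only the pattern, not TPC's models, is shown.
random.seed(0)
households = [{"income": random.lognormvariate(11, 0.7)} for _ in range(100_000)]

def tax_owed(income: float, rate: float, standard_deduction: float = 12_000) -> float:
    """A single flat rate applied above a deduction threshold (hypothetical policy)."""
    return max(income - standard_deduction, 0.0) * rate

def total_revenue(rate: float) -> float:
    return sum(tax_owed(h["income"], rate) for h in households)

baseline = total_revenue(0.20)
reform = total_revenue(0.22)  # evaluate a proposed rate change
print(f"Projected revenue change: {reform - baseline:,.0f}")
```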

  • grantee: Stanford University
    amount: $480,854
    city: Stanford, CA
    year: 2017

    To develop, test, and post new algorithms for estimating heterogeneous causal effects from large-scale observational studies and field experiments

    • Program Research
    • Sub-program Economics
    • Investigator Susan Athey

    This grant funds methodological work by economist Susan Athey, who aims to develop rigorous new statistical algorithms that will allow machine learning programs to isolate causal relationships in large, complex datasets. Athey is building new tools to handle methodological tasks that economists care about but often find challenging, including novel techniques for taking heterogeneity into account while estimating treatment effects, calculating optimal policies, and testing hypotheses in very large and varied populations. Her focus will be on algorithms that are particularly useful for evaluating policy interventions and that make it possible to isolate how policy changes differentially affect the behavior of heterogeneous populations. Athey expects this work to yield several publications in peer-reviewed statistical and econometric journals, and all the algorithms, code, documentation, and nonproprietary data she and her team generate will be made freely available to other researchers.
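
    Athey's new algorithms are themselves the output of this grant; purely as a generic baseline for what estimating heterogeneous treatment effects means, the sketch below uses a simple T-learner (separate outcome models for treated and control units, then a difference in predictions) on simulated experimental data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Generic T-learner baseline for heterogeneous treatment effects; this is a
# textbook approach, not the new algorithms Athey's team is developing.
rng = np.random.default_rng(0)
n = 5_000
X = rng.normal(size=(n, 3))                 # covariates
T = rng.integers(0, 2, size=n)              # randomized treatment assignment
tau = 1.0 + 2.0 * X[:, 0]                   # true effect varies with X[:, 0]
y = X[:, 1] + T * tau + rng.normal(size=n)  # observed outcome

model_treated = RandomForestRegressor(random_state=0).fit(X[T == 1], y[T == 1])
model_control = RandomForestRegressor(random_state=0).fit(X[T == 0], y[T == 0])

tau_hat = model_treated.predict(X) - model_control.predict(X)  # per-unit effect estimates
print(np.corrcoef(tau, tau_hat)[0, 1])      # recovered heterogeneity; should be well above 0
```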

  • grantee: NumFOCUS
    amount: $684,185
    city: Austin, TX
    year: 2017

    To develop a programming toolkit for the construction, execution, and evaluation of macroeconomic simulations where heterogeneous agents interact behaviorally

    • Program Research
    • Sub-program Economics
    • Investigator Christopher Carroll

    Though it has been ten years since the Great Recession, the comprehensive macroeconomic models in use at central banks, government agencies, and other large financial institutions are not noticeably improved from a decade ago. Conversations with leaders of those institutions point to two fundamental flaws in traditional models, namely, the assumptions about representative agents and about rational expectations. These imply not only that the economy evolves as if there is only one consumer and only one firm but also that the consumer and the firm make optimal decisions based on predictions that come true. Why are macroeconomists so reluctant to give up these stultifying assumptions? Because as hard as it is to run models with those assumptions, it is nearly impossible to compute much without them. Chris Carroll of Johns Hopkins University wants to fix this situation. While serving as chief economist at the Consumer Financial Protection Bureau (CFPB), he started constructing an open-source computational toolkit for macroeconomists that can specifically handle non-rational heterogeneous agents. The platform, the Heterogeneous Agents Resources Kit (HARK), is capable of modeling how microeconomic interactions among heterogeneous agents can lead to macroeconomic outcomes different from those predicted by traditional techniques. It is also possible to assign less-than-rational behaviors—such as hyperbolic discounting, anchoring, or herding—to parts of the population. Running simulations under those circumstances can reveal phenomena that traditional models can neither explain nor even generate. This grant provides three years of support to Carroll as he further expands and develops HARK and creates tools to facilitate its use.
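
    Without relying on HARK's specific API, the toy sketch below illustrates the underlying idea of aggregating many behaviorally heterogeneous consumers instead of a single representative agent:

```python
import random

# Toy heterogeneous-agent simulation: many consumers with different
# rule-of-thumb consumption behavior, aggregated each period. This is an
# illustration of the idea only, not HARK's API or Carroll's models.
random.seed(1)

class Consumer:
    def __init__(self) -> None:
        self.wealth = random.uniform(0.0, 10.0)
        self.mpc = random.uniform(0.2, 0.9)  # heterogeneous marginal propensity to consume

    def step(self, income: float) -> float:
        """Consume a fixed fraction of cash on hand (a behavioral heuristic)."""
        cash = self.wealth + income
        consumption = self.mpc * cash
        self.wealth = cash - consumption
        return consumption

agents = [Consumer() for _ in range(10_000)]
aggregate_consumption = 0.0
for _ in range(20):
    aggregate_consumption = sum(a.step(income=1.0) for a in agents)
print(f"Aggregate consumption in the final period: {aggregate_consumption:,.1f}")
```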
