Grants Database

The Foundation awards approximately 200 grants per year (excluding the Sloan Research Fellowships), totaling roughly $80 million in annual commitments in support of research and education in science, technology, engineering, mathematics, and economics. This database contains grants for currently operating programs going back to 2008. For grants from prior years and for now-completed programs, see the annual reports section of this website.

  • grantee: The University of Chicago
    amount: $750,000
    city: Chicago, IL
    year: 2017

    To study how the choice of computational tools such as programming languages and data-analysis environments impacts their users

    • Program Technology
    • Sub-program Data & Computational Research
    • Investigator James Evans

    In linguistics, the Sapir-Whorf hypothesis holds that the structure of a language affects its speakers’ world view and modes of thought. University of Chicago computational sociologist James Evans and University of Wisconsin cognitive scientist Gary Lupyan hypothesize that a version of this hypothesis applies to programming languages. They propose to explore the “cognitive and social consequences of programming and data analysis environment” choices, specifically how the characteristics of programming languages might influence a developer’s efficiency, creativity, and collaboration. To evaluate this hypothesis, Evans and Lupyan will undertake exploratory studies of observational data on software development broadly, then look more closely at specific cases in scientific software development. They will use large-scale project data from GitHub to determine which specific features of programming languages (e.g., static vs. dynamic variable typing) might be best operationalized as independent variables that influence the ways in which developers think and work. They will then test the hypotheses that surface through that exploratory work using a series of comparative-language experiments to be run in constrained development environments, including the Jupyter Notebook platform. Grant funds provide three years of research support for the project.

  • grantee: Rochester Institute of Technology
    amount: $470,458
    city: Rochester, NY
    year: 2017

    To develop a mathematically-aware search engine for popular use by both students and experts

    • Program Technology
    • Sub-program Scholarly Communication
    • Investigator Richard Zanibbi

    A “math aware” search engine is exactly what it sounds like: a search engine that speaks and understands the language of mathematics. It would be able not only to locate words on pages, but also to identify and recognize mathematical symbols, expressions, equations, formulas, and theorems. This is harder than it sounds, since common mathematical symbols can take on special meanings depending on the context in which they appear. This grant funds work by computer scientists Richard Zanibbi and Lee Giles to create an easy-to-use, fully math-aware search engine. Zanibbi and Giles plan to develop state-of-the-art methods for extracting, indexing, and retrieving math in documents; develop algorithms for the recognition of handwritten math and math captured in images; and implement these in a user-friendly interface with helpful features like autocompletion of common queries. The new engine will then be tested on Wikipedia and on CiteSeerX, an open-source repository of academic papers. The completed search engine, if successful, would vastly expand the possibilities of discovery for amateur and professional mathematicians alike, with numerous applications in both research and education.

  • grantee: Association of Research Libraries
    amount: $315,100
    city: Washington, DC
    year: 2017

    To develop and disseminate a Code of Best Practices in Fair Use for Software Preservation

    • Program Technology
    • Sub-program Scholarly Communication
    • Investigator Krista Cox

    This grant funds an initiative by the Association of Research Libraries to document and clarify copyright and intellectual property law issues related to the archiving of software. Led by intellectual property lawyer Peter Jaszi, the initiative has three parts. First, Jaszi and a team of collaborators will undertake a broad literature review and conduct some 40 long-form interviews with legal experts, librarians, museum curators, software developers, and other stakeholders to produce “a report on problems that arise in software preservation regarding issues of copyright and fair use.” The report will then become the basis for a set of small workshops to generate, after legal review, a code of reasonable best practices that archivists can use to resolve those problems. Finally, a substantial outreach push will build community consensus in support of those best practices. The work will be stewarded by the Association of Research Libraries, whose membership has a strong interest in this area, but will also draw heavily on the museum community, as well as major professional organizations in computer science and other computationally intensive disciplines. The effort promises to clarify the legal state of play surrounding several thorny intellectual property issues related to software archiving, promote better archival practices across the country, and further the cause of reproducibility in research, which depends on the continued availability of the software used to generate scientific results.

  • grantee: Columbia University
    amount: $490,298
    city: New York, NY
    year: 2017

    To integrate behavioral insights into the foundations of standard macroeconomic models by re-examining the role of the Euler equation

    • Program Research
    • Sub-program Economics
    • Investigator Emi Nakamura

    How do people decide between consuming more today and saving more for the future? Mainstream macroeconomists have one answer: the Euler Equation. Simply put, it says that an optimizing agent will consume today up to the point where adding one more unit now would provide the same utility that could be expected if consuming that extra unit were deferred until tomorrow instead. In principle, the attitude expressed by the Euler Equation seems reasonable enough. Surely if you knew that having a second dessert right now, for example, would not be as enjoyable as having that dessert tomorrow, you would do well to wait. Yet as a practical matter, such calculations are difficult or impossible for individuals to make. And we all know from experience that hardly anyone ever tries. Real people rely on heuristics at best, and are sometimes not only inconsistent but also self-defeating. The Euler Equation also has theoretical implications that limit its applicability to the real world. For instance, to a population governed by the Euler Equation, the timing of consumption does not depend on when income arrives. So Euler populations will not alter their behavior in response to income events like tax cuts. But clearly people in the real world do alter their behavior. This grant funds the research of Emi Nakamura and Jon Steinsson of Columbia University to test alternatives to the Euler Equation against a uniquely comprehensive dataset of the consumer behavior of the residents of Iceland, which has for years usefully kept records of nearly every financial transaction in the country. The goal is to devise a replacement for the Euler Equation and to create new macroeconomic models that are both less naïve and more useful in predicting consumer behavior in the real world.
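    In its standard textbook form, the consumption Euler equation equates the marginal utility of consuming one more unit today with the discounted, return-adjusted expected marginal utility of consuming that unit tomorrow:

```latex
u'(c_t) \;=\; \beta \,(1+r)\; \mathbb{E}_t\!\left[\, u'(c_{t+1}) \,\right]
```

    Here $u$ is the utility function, $c_t$ is consumption in period $t$, $\beta$ is the discount factor, and $r$ is the interest rate. The behavioral critique above is precisely that real households rarely, if ever, behave as though they solve this condition.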

  • grantee: Brookings Institution
    amount: $632,355
    city: Washington, DC
    year: 2017

    To promote independent, unbiased, and nonpartisan economic research on regulatory economics

    • Program Research
    • Sub-program Economics
    • Investigator Adam Looney

    Effective government regulations can improve citizens’ health, safety, and financial well-being and reduce market imperfections. On the other hand, regulations that are poorly designed or implemented can impair markets, impose burdens, and impede innovation. There are potential benefits from regulatory interventions that mitigate imperfections but also potential costs from necessarily imperfect regulation. The challenge is to find an appropriate balance. This grant provides support for an initiative at the Brookings Institution to found a new, evidence-based, non-ideological Center on Regulation and Markets. In recent years, the trend has been for academics interested in regulation to specialize in environmental, health, labor, or other specific regulatory contexts. While this approach has many merits, such specialization deprives the field of the insights and wisdom that come from the wider study of regulation as such. The new Brookings Center will aim to recapture those insights and revitalize regulatory economics by incorporating recent behavioral, technological, societal, and legal perspectives. The new Center will initially concentrate on three work streams: Regulatory Processes and Perspectives, Market and Government Failures, and the Regulation of Financial Markets. Specific topics range from autonomous vehicles and the sharing economy to bankruptcy law and cost/benefit estimation methods. Outputs will include peer-reviewed papers, policy briefs, roundtables, and conferences.

  • grantee: Private Capital Research Institute
    amount: $500,000
    city: Boston, MA
    year: 2017

    To set up an Administrative Data Research Facility that makes data about the private capital industry accessible to researchers

    • Program Research
    • Sub-program Economics
    • Investigator Josh Lerner

    Representing roughly $4 trillion globally, private capital plays an outsize role in productivity trends since its investments traditionally promote innovation and reorganization. Since private equity is private, however, there is very little available data on how venture capital or private equity firms invest in companies. What few studies we do have come from proprietary data that cannot be shared and thus cannot be subjected to normal scientific attempts to replicate or check findings and results. Josh Lerner, a distinguished scholar at the Harvard Business School, is so keen on making data about this sector more available to academic researchers that he established a nonprofit, the Private Capital Research Institute (PCRI), explicitly for that purpose. This grant funds a project by Lerner and his team at PCRI to compile a large dataset of Certificates of Incorporation (COIs). COI filings record significant details about the provision of private capital, including information on the capital structure and key terms of venture capital deals along with important information about valuation. Though nominally public, COIs are in practice quite difficult to obtain or study other than one at a time. Lerner and the PCRI staff will use grant funds to acquire approximately 6,000 COIs and begin compiling a database that tracks 20-30 variables contained in COIs. The database will then be made available to academic researchers.

  • grantee: Urban Institute
    amount: $616,926
    city: Washington, DC
    year: 2017

    To demonstrate new statistical and visualization capabilities by migrating massive microsimulation models to the cloud

    • Program Research
    • Sub-program Economics
    • Investigator Robert McClelland

    Evaluating the impact of proposed changes to the law requires predicting how people’s behavior will change in response to this or that policy change. These predictions are made using microsimulations. Researchers compile data from a representative sample of the population, run models that estimate what those individuals will do in response to changes in, say, the tax code, and then aggregate the results. This is a traditional tool not just for economists but also for the study of traffic, finance, epidemiology, and crowds. The problem with microsimulations, however, is that they are computationally unwieldy. Running a sophisticated model requires lots of time and computing power. Funds from this grant support efforts by Robert McClelland at the Urban Institute’s Tax Policy Center (TPC) to take the next big step in microsimulation by harnessing the power of cloud-based computing. McClelland will move the TPC’s existing tax policy evaluation microsimulation models to the cloud, allowing the models to run faster and allowing multiple simulations to run at once. This will make it routinely practical, for example, to see how robust results are to changes in parameter choices, to evaluate many different policy options and see which works best, and to handle nonlinearities due to thresholds in the tax code where different rules kick in or out. Basic statistical tasks—like obtaining variances, building confidence intervals, or testing hypotheses—should run in a matter of hours rather than months. These new capabilities will greatly enhance how useful TPC’s models are for rapidly understanding proposed changes in the tax code. The TPC team will then test these new capabilities by investigating three specific research questions: How does uncertainty in growth rates and recession timing affect projected tax revenues? How does sampling variation affect model behavior? And how can tax policies improve distributional outcomes without reducing revenue?
Lastly, TPC will also launch an interactive website where the public can explore and visualize tax plans of their own design in real time.
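    The compile-a-sample, apply-a-rule, aggregate workflow described above can be sketched in miniature. This is purely an illustrative toy, not TPC's actual model: the two-bracket `tax_owed` rule, the $50,000 threshold, and the sample records are all invented for the example.

```python
# Minimal microsimulation sketch (illustrative only; not TPC's model).
# Each record is one sampled taxpayer; we apply a hypothetical tax-rule
# change and aggregate the revenue impact across the weighted sample.

def tax_owed(income, rate_below, rate_above, threshold=50_000):
    """Hypothetical two-bracket tax: one rate below a threshold, one above."""
    below = min(income, threshold)
    above = max(income - threshold, 0)
    return below * rate_below + above * rate_above

def simulate_revenue(sample, rate_below, rate_above):
    """Aggregate tax revenue over a weighted population sample."""
    return sum(w * tax_owed(inc, rate_below, rate_above) for inc, w in sample)

# Invented sample: (income, population weight) pairs.
sample = [(30_000, 1_000), (60_000, 500), (120_000, 200)]

baseline = simulate_revenue(sample, 0.10, 0.25)
reform = simulate_revenue(sample, 0.10, 0.28)   # raise the top rate
print(f"Revenue change: ${reform - baseline:,.0f}")
```

    A real model would replace the toy tax rule with the full code, add behavioral responses, and run many such simulations in parallel in the cloud, which is what makes the variance and sensitivity analyses mentioned above practical.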

  • grantee: Stanford University
    amount: $480,854
    city: Stanford, CA
    year: 2017

    To develop, test, and post new algorithms for estimating heterogeneous causal effects from large-scale observational studies and field experiments

    • Program Research
    • Sub-program Economics
    • Investigator Susan Athey

    This grant funds methodological work by economist Susan Athey, who aims to develop rigorous new statistical algorithms that will allow machine learning programs to isolate causal relationships in large, complex datasets. Athey is building special new tools to handle methodological tasks that economists care about but often find challenging. These include novel techniques for taking heterogeneity into account while estimating treatment effects, calculating optimal policies, and testing hypotheses in very large and varied populations. Athey’s focus will be on computing algorithms that are particularly useful for evaluating policy interventions and that enable one to isolate how policy changes differentially affect the behavior of heterogeneous populations. As a result of her work, she expects to publish several pieces in peer-reviewed statistical and econometric journals, and all the algorithms, code, documentation, and nonproprietary data Athey and her team generate will be made freely available to other researchers.
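    What "heterogeneous treatment effects" means can be illustrated with a deliberately simple hand-coded version: a difference-in-means estimate computed separately within each subgroup. This is not one of Athey's algorithms (her methods discover the relevant subgroups automatically from the data, at scale); here the split and the data are invented for the example.

```python
# Illustrative sketch: a treatment effect that differs across subgroups,
# estimated by within-group difference in means. Not Athey's actual
# algorithms; the groups and records below are made up.

from statistics import mean

def subgroup_effects(records):
    """records: (group, treated_flag, outcome) tuples.
    Returns {group: difference-in-means treatment-effect estimate}."""
    effects = {}
    for g in sorted({grp for grp, _, _ in records}):
        treated = [y for grp, t, y in records if grp == g and t]
        control = [y for grp, t, y in records if grp == g and not t]
        effects[g] = mean(treated) - mean(control)
    return effects

# Invented data: the intervention helps the "low" group far more.
data = [
    ("low",  1, 12.0), ("low",  1, 14.0), ("low",  0, 8.0),  ("low",  0, 10.0),
    ("high", 1, 21.0), ("high", 1, 19.0), ("high", 0, 20.0), ("high", 0, 18.0),
]
print(subgroup_effects(data))
```

    A single pooled estimate would average these effects together and mask the heterogeneity; methods like Athey's aim to recover it from observational and experimental data without pre-specifying the groups.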

  • grantee: NumFOCUS
    amount: $684,185
    city: Austin, TX
    year: 2017

    To develop a programming toolkit for the construction, execution, and evaluation of macroeconomic simulations where heterogeneous agents interact behaviorally

    • Program Research
    • Sub-program Economics
    • Investigator Christopher Carroll

    Though it has been ten years since the Great Recession, the comprehensive macroeconomic models in use at central banks, government agencies, and other large financial institutions are not noticeably improved from a decade ago. Conversations with leaders of those institutions point to two fundamental flaws in traditional models, namely, the assumptions about representative agents and about rational expectations. These imply not only that the economy evolves as if there is only one consumer and only one firm but also that the consumer and the firm make optimal decisions based on predictions that are realized. Why are macroeconomists so reluctant to give up these stultifying assumptions? Because as hard as it is to run models with those assumptions, it is nearly impossible to compute much without them. Chris Carroll of Johns Hopkins University wants to fix this situation. While serving as chief economist at the Consumer Financial Protection Bureau (CFPB), he started constructing an open source computational tool kit for macroeconomists that can specifically handle non-rational heterogeneous agents. The platform, the Heterogeneous Agents Resources Kit (HARK), is capable of modeling how microeconomic interactions among heterogeneous agents can lead to macroeconomic outcomes different from those predicted by traditional techniques. It is also possible to assign less-than-rational behaviors—such as hyperbolic discounting, anchoring, or herding—to parts of the population. Running simulations under those circumstances can reveal phenomena that traditional models can neither explain nor even generate. This grant provides three years of support to Carroll as he further expands and develops HARK and creates tools to facilitate its use.
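    The core idea of dropping the representative-agent assumption can be sketched in a few lines. This toy is not HARK's actual API: the two-agent population, the marginal-propensity-to-consume rule, and all parameter values are invented to show how heterogeneity alone changes aggregate outcomes.

```python
# Toy heterogeneous-agent simulation (not HARK's API; values invented).
# Agents differ in their marginal propensity to consume (MPC) out of
# cash on hand, so aggregate consumption is not what a single
# "representative" consumer would produce.

def simulate(agents, income, periods=3, rate=0.0):
    """agents: list of (initial_wealth, mpc) pairs.
    Returns the aggregate consumption path over `periods`."""
    wealth = [w for w, _ in agents]
    mpcs = [m for _, m in agents]
    path = []
    for _ in range(periods):
        total = 0.0
        for i, m in enumerate(mpcs):
            cash = wealth[i] * (1 + rate) + income   # cash on hand
            c = m * cash                             # heterogeneous rule
            wealth[i] = cash - c                     # carry savings forward
            total += c
        path.append(total)
    return path

# A patient saver (low MPC) alongside a hand-to-mouth consumer (MPC near 1).
print(simulate([(10.0, 0.3), (0.0, 0.95)], income=1.0))
```

    HARK generalizes this pattern to large populations with richer behaviors (hyperbolic discounting, anchoring, herding) and solves the agents' problems properly rather than imposing a fixed rule.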

  • grantee: Innovations for Poverty Action
    amount: $660,365
    city: New Haven, CT
    year: 2017

    To study the behavioral welfare economics of potential interventions in four kinds of critical consumer decisions

    • Program Research
    • Sub-program Economics
    • Investigator Hunt Allcott

    This grant funds a project by Hunt Allcott, Dmitry Taubinsky, and Jonathan Zinman to model four common kinds of consumer decisions and then use those models to analyze the welfare implications of potential policy interventions aimed at altering these decisions. They plan to examine supposed “mistakes” people make when deciding about sugar-sweetened beverages, credit card borrowing, checking account overdrafts, and college enrollment. In each context, the research team will start by formulating a theoretical model that can accommodate a range of consumer behaviors. Next, they will perform empirical analyses using experimental, quasi-experimental, and survey designs to identify biases and test predictions. Then they will analyze the empirical welfare implications of various regulatory or other interventions aimed at altering consumer choices in these areas. In addition to covering data collection costs, grant funds will support research assistants and a single project manager for all four studies.
