One of the more surprising difficulties of working with big data (more than a few hundred gigabytes) is the sheer logistics of moving it from place to place. Although the price of cloud and local computing has dropped and available bandwidth has increased, the standard protocols for transferring data over the Internet (HTTP and FTP) simply break down at that scale: errors multiply, requiring laborious file integrity checks and repeated restarts of transfer operations. There is, as yet, no satisfactory solution to the simple yet thorny problem of moving meso- and larger-scale data from one computer to another. Globus, a data management tool developed by a team at the University of Chicago's Computation Institute, offers a promising solution to these problems, allowing the seamless transfer of large datasets without the drawbacks of existing methods. The project is currently pivoting from grant-funded support to a sustainable nonprofit business model based on both individual and institutional subscriptions, and has already signed up six major universities as charter members. However, it faces a catch-22: the team needs robust marketing and customer support capacity to build a customer base, but without a customer base it will not have the funds to provide marketing and customer support. Funds from this grant provide temporary bridge funding to the Globus platform, enabling the project to deliver top-quality service while it builds a customer base and moves toward independent sustainability.