Winners of Neukom Graduate Fellowships have been announced for the 2010-2011 academic year. Fellowships will provide a full year of funding, including stipend and benefits, to Ph.D. students engaged in faculty-advised research on the development of novel computational techniques, as well as on the application of computational methods to problems in the Sciences, Social Sciences, and Humanities.
The 2010-2011 winners are:
We propose a framework for cosmological simulations that runs on graphics cards using the Open Computing Language (OpenCL). Simulations of scalar fields that are carried out on traditional CPUs can be greatly accelerated by exploiting the immense inherent parallelism of modern GPUs. gLattice will provide a mechanism to harness the power of GPUs in cosmological simulations, abstracting the implementation details away from the end user. Such simulations will thus be faster and will require fewer resources than traditional clusters of CPUs. The platform independence of the OpenCL language will make the framework compatible with the already large number of compliant graphics cards. Preliminary data show that, using gLattice, one can achieve almost an order of magnitude speed-up over a powerful CPU with a modern, moderately priced graphics card. Results from this research were recently published:
Marcelo Gleiser, Noah Graham, and Nikitas Stamatopoulos, "Generation of Coherent Structures After Cosmic Inflation," Phys. Rev. D 83, 096010 (2011) [arXiv]
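The reason scalar-field lattice simulations map so well onto GPUs is that each update touches only a site and its nearest neighbours, so every site can be computed in parallel. The sketch below is not gLattice's OpenCL code; it is a minimal NumPy analogue (function names and parameters are illustrative) of a leapfrog update for a massive scalar field on a periodic 3-D lattice, showing the per-site stencil that an OpenCL kernel would evaluate once per work-item.

```python
import numpy as np

def laplacian(phi, dx):
    """Discrete 3-D Laplacian on a periodic lattice (nearest-neighbour stencil)."""
    lap = -6.0 * phi
    for axis in range(3):
        lap += np.roll(phi, 1, axis=axis) + np.roll(phi, -1, axis=axis)
    return lap / dx**2

def leapfrog_step(phi, phi_prev, dt, dx, m2):
    """One leapfrog update of phi'' = lap(phi) - m^2 phi. Each lattice site
    depends only on its neighbours, which is what makes the update
    embarrassingly parallel on a GPU."""
    return 2.0 * phi - phi_prev + dt**2 * (laplacian(phi, dx) - m2 * phi)

# Tiny demo: a uniform field with m^2 = 0 is a static solution,
# so one step leaves it unchanged.
N, dx, dt = 16, 0.5, 0.1
phi = np.ones((N, N, N))
phi_new = leapfrog_step(phi, phi.copy(), dt, dx, m2=0.0)
```

In an OpenCL port, the body of `leapfrog_step` becomes a kernel executed by one work-item per lattice site, which is where the order-of-magnitude speed-up quoted above comes from.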
Complex systems arise in practically all fields; examples include the correlation networks of equities, the networks of over-the-counter (OTC) derivative liabilities, social networks like Facebook and Twitter, the evolution of ideas in constitutions, and various biological networks. These systems are dynamic, making the traditional assumption of an independent, static system unrealistic and leading to incorrect conclusions and poor predictions.
In this research we propose to develop statistical tools for analyzing dynamic complex systems. Specifically, we will explore nonparametric methods (both Bayesian and frequentist) to help uncover the underlying structure that explains the observed phenomena, as well as to provide predictive power for future observations. The data streams underlying these complex systems are generally quite large, so the development of these techniques in the context of so-called "massive data sets" (especially in the social and financial realm) is of particular interest, and we will develop efficient inference algorithms that scale to the amount of data encountered in both research and commercial settings.
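As a concrete, much simpler stand-in for the dynamic-structure estimation described above (the proposal's actual nonparametric methods are not specified here), one can estimate a sequence of correlation networks from equity returns with a sliding window; the function name and window choice below are illustrative assumptions.

```python
import numpy as np

def rolling_correlation_networks(returns, window):
    """Estimate a time-varying correlation network from a (T, n) array of
    asset returns: one n-by-n correlation matrix per sliding window,
    instead of a single static matrix for the whole sample."""
    T, n = returns.shape
    nets = [np.corrcoef(returns[t:t + window].T)
            for t in range(T - window + 1)]
    return np.array(nets)  # shape (T - window + 1, n, n)

# Demo on synthetic returns for 5 assets over 100 days.
rng = np.random.default_rng(0)
returns = rng.standard_normal((100, 5))
nets = rolling_correlation_networks(returns, window=30)
```

A static analysis would compute one correlation matrix from all 100 days; the rolling version exposes how the dependence structure drifts over time, which is the kind of dynamics the proposed methods aim to model rigorously.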
Last Updated: 3/25/13