Unlocking the Milky Way's Secrets: How New Tech Bridges Theory and Observation
A Powerful Tool for Comparing Galactic Models and Observational Data
The past decade has witnessed remarkable advancements in both the theoretical understanding and observational mapping of the Milky Way. Extensive photometric surveys like 2MASS and SDSS have charted significant portions of our galaxy, while spectroscopic surveys such as GCS, SEGUE, and RAVE have provided radial velocity and spectral data for a multitude of stars. Furthermore, ambitious projects like LSST, GAIA, and SkyMapper promise an unprecedented wealth of new observations.
This surge in data, characterized by both the sheer number of stars cataloged and their distribution across the sky, has created a pressing need for efficient tools that can bridge the gap between theoretical models of our galaxy and the ever-expanding observational landscape. To facilitate this crucial comparison, theoretical models must be translated into the observational domain and then reconciled with inherent observational uncertainties. The Galaxia code is specifically designed to address this challenge.
Galaxia uses fast, efficient algorithms to create synthetic surveys of the Milky Way, and it can sample both analytic and N-body models. For N-body models, it implements a scheme that disperses the stars spawned by each N-body particle in such a way that the phase-space density of the spawned stars remains consistent with that of the underlying N-body particles.
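The dispersal idea can be illustrated with a toy sketch: scatter the spawned stars around each particle with a kernel whose size tracks the local spacing of the particles, so that the spawned population roughly follows the particles' phase-space density. The function name `spawn_stars` and the k-nearest-neighbour smoothing rule below are illustrative assumptions, not Galaxia's actual implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

def spawn_stars(particles, n_spawn, k=32, h_scale=1.0):
    """Disperse `n_spawn` stars around each N-body particle in 6D
    phase space (positions + velocities), with Gaussian scatter set by
    the distance to the k-th nearest neighbour, so the spawned stars
    roughly trace the particles' local phase-space density.
    Illustrative sketch only."""
    tree = cKDTree(particles)
    d, _ = tree.query(particles, k=k + 1)   # k+1: the query includes the particle itself
    h = h_scale * d[:, -1]                  # local smoothing length per particle
    stars = [p + rng.normal(scale=hp, size=(n_spawn, particles.shape[1]))
             for p, hp in zip(particles, h)]
    return np.vstack(stars)

# toy example: 500 particles drawn from a 6D Gaussian
parts = rng.normal(size=(500, 6))
stars = spawn_stars(parts, n_spawn=10)   # 5000 spawned stars
```

Because the smoothing length shrinks where particles are closely packed, dense regions of phase space stay dense in the spawned catalogue, which is the property the text describes.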
Galaxia: Bridging Theory and Observation
Galaxia employs the von Neumann rejection technique to sample analytic models efficiently. This method is well suited to continuous sampling across multidimensional spaces. Recognizing that a naive approach can be computationally expensive, the code divides the galaxy into a set of roughly equal-mass nodes and applies rejection sampling to each. This division allows for optimization: for instance, depending on a node's distance, a suitable lower limit can be set on the mass of the stars to be generated, boosting processing speed.
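The von Neumann rejection step itself is easy to sketch. Below is a minimal, generic implementation; Galaxia's node partitioning and distance-dependent mass cuts are omitted, and the function name `rejection_sample` and the exponential-disc example are illustrative, not taken from the code.

```python
import numpy as np

rng = np.random.default_rng(1)

def rejection_sample(pdf, bounds, f_max, n):
    """von Neumann (acceptance-rejection) sampling of `pdf` inside a
    box. `bounds` lists (lo, hi) per dimension; `f_max` must bound the
    pdf from above everywhere inside the box."""
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    out = []
    while len(out) < n:
        x = rng.uniform(lo, hi, size=(n, len(bounds)))  # candidate points
        u = rng.uniform(0.0, f_max, size=n)             # uniform "heights"
        out.extend(x[u < pdf(x)])                       # keep points under the curve
    return np.array(out[:n])

# example: sample radii from an exponential disc profile (scale length 3)
disc_pdf = lambda x: np.exp(-x[:, 0] / 3.0)
samples = rejection_sample(disc_pdf, bounds=[(0.0, 30.0)], f_max=1.0, n=10_000)
```

The acceptance rate is the ratio of the pdf's integral to `f_max` times the box volume, which is exactly why subdividing the galaxy into nodes with locally tuned bounds pays off.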
- Efficiency: Speeds of about 0.16 million stars per second can be reached on a single 2.44 GHz CPU, and shallower surveys run even faster.
- Comprehensive Surveys: A V < 20 magnitude-limited survey of the north Galactic pole region, covering 10,000 square degrees and consisting of about 35 million stars, can be generated in 220 seconds.
- Large-Scale Applications: A V < 20 all-sky GAIA-like survey would require about 6 hours on a single CPU.
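The quoted figures are mutually consistent, as a quick back-of-the-envelope check shows (all numbers are taken from the benchmarks above):

```python
# Consistency check of the quoted benchmarks.
stars = 35e6      # stars in the V < 20, 10,000 deg^2 survey
seconds = 220     # generation time on one CPU
rate = stars / seconds
print(f"{rate / 1e6:.2f} million stars per second")  # 0.16

# Implied star count for a ~6 h all-sky survey at the same rate:
print(f"{rate * 6 * 3600 / 1e9:.1f} billion stars")  # 3.4
```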
Galactic Archeology with Velocity Information
Galaxia can be used to identify structures in the stellar halo. In the currently favored ΛCDM paradigm of structure formation, the stellar halo is thought to have been built up by the accretion of satellite galaxies.
To identify these structures objectively, we use the multi-dimensional group-finding algorithm EnLink. The advantage of EnLink is that it can work in spaces of arbitrary dimension; moreover, it has a built-in significance estimator, so only genuine groups that stand out above the Poisson noise are reported.
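EnLink itself is not reproduced here. As a rough illustration of density-based group finding in a space of arbitrary dimension, the sketch below uses a simple friends-of-friends linking as a stand-in; EnLink's density ranking and Poisson-significance test are omitted, and all names and parameter values are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)

def fof_groups(points, linking_length, min_size=10):
    """Friends-of-friends grouping in a space of arbitrary dimension:
    points within `linking_length` of each other are linked, and
    connected sets of at least `min_size` points form a group.
    A stand-in for EnLink, for illustration only."""
    tree = cKDTree(points)
    pairs = tree.query_pairs(linking_length)
    parent = np.arange(len(points))     # union-find over linked pairs
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for a, b in pairs:
        parent[find(a)] = find(b)
    roots = np.array([find(i) for i in range(len(points))])
    groups = [np.where(roots == r)[0] for r in np.unique(roots)]
    return [g for g in groups if len(g) >= min_size]

# toy data in 4D: a clump of 200 points on a uniform background of 1000
background = rng.uniform(-10, 10, size=(1000, 4))
clump = rng.normal(loc=3.0, scale=0.2, size=(200, 4))
data = np.vstack([background, clump])
groups = fof_groups(data, linking_length=0.5, min_size=50)
```

A real pipeline would additionally rank each group against the Poisson expectation for its volume, which is the role of EnLink's significance estimator.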
As radial-velocity and proper-motion information is added, more groups are discovered and a larger number of unique accretion events is sampled. In the distance range 20–70 kpc in particular, adding velocity information opens up the potential to discover a large number of new structures.