Matthew J Graham
I'm a Research Professor of Astronomy at the California Institute of Technology and the Project Scientist for the Zwicky Transient Facility (ZTF), the first of a next generation of time-domain sky surveys producing hundreds of thousands of public transient alerts per night. I have previously worked on the Catalina Real-time Transient Survey (CRTS), a still unmatched data set in terms of temporal baseline coverage; the NOAO DataLab; the Virtual Observatory; and the Palomar-Quest Digital Sky Survey.
My main research interests are the application of machine learning and advanced statistical methodologies to astrophysical problems, particularly the variability of quasars and other stochastic time series. This work is motivated in part by the unprecedented volumes of data that 21st century astronomy is generating, but also by the need to expand our ability to work with complex systems of information beyond simple correlations.
What did we get right? Lessons learned from the first 300 million alerts of ZTF
The Zwicky Transient Facility (ZTF) has been serving transient alerts to the astronomical community for 2.5 years and has just passed the 300 million mark, announcing the detection of almost 5,000 supernovae, 25 tidal disruption events, numerous asteroids, and other astrophysical phenomena in that time frame. The Rubin Observatory will reach this milestone within the first three months of alert generation, and efforts are underway to ensure that the necessary infrastructure for alert dissemination, reception, and response is in place. We are building, however, on concepts envisaged well over a decade ago, and in this talk I will review the trials and tribulations of ZTF in seeding such a framework and consider what we might have done differently. In particular, our concerns regarding scale and follow-up capability merit revisiting, and a new vision should potentially be developed of what the landscape might actually look like during the era of LSST operations.