
Dark Energy Survey on the OSG

Ken Herner OSG All-Hands Meeting 6 Mar 2017

Credit: T. Abbott and NOAO/AURA/NSF

The Dark Energy Survey: Introduction

• Collaboration of 400 scientists using the Dark Energy Camera (DECam) mounted on the 4 m Blanco telescope at CTIO in Chile
• Currently in fourth year of 5-year mission
• Main program is four probes of dark energy:
  – Type Ia Supernovae
  – Baryon Acoustic Oscillations
  – Galaxy Clusters
  – Weak Lensing
• A number of other projects, e.g.:
  – Trans-Neptunian / moving objects

Recent DES Science Highlights (not exhaustive)

• Cosmology from large-scale galaxy clustering and cross-correlation with weak lensing
• First DES quad-lensed quasar system
• 2014 UZ224 discovery (second-most distant Trans-Neptunian object)
• Optical follow-up of GW triggers


Figure (from the DES J0408-5354 lensed quasar discovery paper): DES gri color composite discovery image (top left), SExtractor segmentation map plotted over the DES i-band coadded image (bottom left), Gemini i-band acquisition image (top right), and the spectroscopic slit layout (bottom right). The central red lensing galaxy is G1, the three blue lensed quasar images are A, B, and D, a fourth red image is G2, and the redMaGiC galaxy is G0. The five components (A, B, D, G1, G2) of the system are not fully separated in the DES catalog, with D+G1 and B+G2 remaining blended.

[Paper excerpt shown on the slide, starting mid-sentence:] ...lar DES Y1 blue-near-red and related searches, we also see other cases in galaxy group and cluster environments where the lensing system is being indirectly found this way (Diehl et al. 2017, in preparation). The coordinates and photometry of the three DES catalog objects are given in Table 1, while we present in Agnello et al. (2017) a more detailed analysis that provides the deblended positions and magnitudes of all five components in this system. The DES SExtractor auto magnitudes of the three catalog objects are listed in Table 1, as are the near-infrared Kron magnitudes (Kron 1980) from the VISTA Hemisphere Survey (VHS; McMahon et al. 2013) and the mid-infrared magnitudes from the Wide-field Infrared Survey Explorer (WISE; Wright et al. 2010). For the near-infrared catalog, magnitudes were obtained by cross-matching the DES and VHS catalogs using a 1.5″ search radius. Component D+G1 does not have a J-band magnitude value due to blending with B+G2 in this band. For the mid-infrared magnitudes, the DES and WISE catalogs were cross-matched using a 4.0″ search radius. All components of the system are blended into a single WISE source, which is why the same values of the W1 and W2 magnitudes are displayed in the table. After DES J0408-5354 was first announced to the STRIDES Collaboration as a lensed quasar candidate from the blue-near-red search, a number of other search methods within STRIDES were examined and also seen to identify the system as a candidate. These other search methods are described in more detail in Agnello et al.

Overview of DES computing resources

• About 3 dozen rack servers, 32-48 cores each, part of FNAL GPGrid but DES can reserve them; used for nightly processing, reprocessing campaigns, and deep coadds (64+ GB RAM), using direct submission from NCSA
• Allocation of 980 "slots" (1 slot = 1 CPU, 2 GB RAM) on FNAL GPGrid, plus opportunistic cycles
• OSG resources (all sites where the Fermilab VO is supported)
• NERSC (not for all workflows)
• Various campus clusters
• Individuals have access to the FNAL Wilson (GPU) Cluster
  – Difficult to run at scale due to overall demand
• By the numbers:
  – 2016: 1.98 M hours; 92% on GPGrid
  – 2.42 M hours in the last 12 months; 97% on GPGrid
  – Does not count NERSC/campus resources
  – Does not count NCSA->FNAL direct submission

Overview of GW EM followup

[Workflow diagram:] A LIGO/Virgo trigger (probability map, distance, etc.) arrives via the Gamma-Ray Coordinates Network (GCN). The DES-GW group combines the trigger information from LIGO with source-detection probability maps, informs DES management of the desire to follow up, and takes the final decision with them. Observe? If yes: formulate an observing plan, take the observations, process the images, analyze the results, report the area(s) observed, and provide details of any candidates for spectroscopic follow-up by other partners; trigger information and results are shared with the other EM follow-up partners. If no: wait for the next one.

Difference Imaging Software (GW Follow-up and TNOs)

• Originally developed for Supernova studies
• Several ways to get GW events

"The processing job here on Fermigrid, then, is to process these images, subtract off another picture of the same piece of sky already in hand from previous observations, and then look (via a computer algorithm) for a spot of light on the difference image. This is a GW counterpart: a flash corresponding to a chirp."

• DES is sensitive to neutron star mergers or BH-NS mergers (which should have an optical counterpart), and core collapse
• Main analysis: use the "difference imaging" pipeline to compare search images with the same piece of sky in the past, i.e. look for objects that weren't there before (see the sketch below)
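Not part of the original slides: a minimal toy sketch of the difference-imaging idea described above, assuming the search and template images are already aligned, PSF-matched numpy arrays. The real pipeline does PSF matching, masking, and candidate vetting that this example omits; all names and values here are hypothetical.

import numpy as np

def find_new_sources(search_img, template_img, sky_sigma, nsigma=5.0):
    # Subtract the template of the same piece of sky from the new search
    # image, then flag pixels that brightened well above the sky noise.
    diff = search_img - template_img
    candidates = np.argwhere(diff > nsigma * sky_sigma)   # (row, col) positions
    return diff, candidates

# Hypothetical usage with fake 100x100 images and one injected transient.
rng = np.random.default_rng(42)
template = rng.normal(100.0, 5.0, size=(100, 100))         # static sky
search = template + rng.normal(0.0, 5.0, size=(100, 100))  # new night, same sky
search[42, 17] += 200.0                                     # a "new" spot of light
diff, cands = find_new_sources(search, template, sky_sigma=5.0)
print(cands)   # should contain [42 17]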

Image analysis pipeline

• Each search and template image first goes through "single epoch" (SE) processing (a few hours per image). About 10 templates per image on average (some overlap, of course)
  – New since last AHM: SE code somewhat parallelized (via the joblib package in Python; sketch below). Now uses 4 CPUs and 3.5-4 GB memory; up to 100 GB local disk. Run time is similar or shorter despite additional new processing/calibration steps
  – Increased resource requirements don't hurt as much because memory per core actually went down
• Once done, run difference imaging (template subtraction) on each CCD individually (around 1 hour per job, 2 GB RAM, ~50 GB local disk)
• Totals for first event: about 240 images for the main analysis × 59 usable CCDs per image (3 unusable) over three nights ≈ 5000 CPU-hours of diffimg runs needed per night (roughly 80 images/night × 59 CCDs × ~1 hour per job)
  – Recent events have been similar
• File I/O is with Fermilab dCache
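The slides say the SE code was parallelized with the Python joblib package to use 4 CPUs; the snippet below only illustrates that pattern. The function and exposure names are hypothetical placeholders, not the actual DES SE code.

from joblib import Parallel, delayed

def process_ccd(exposure_id, ccd):
    # Placeholder for the real per-CCD single-epoch steps
    # (detrending, astrometry, photometric calibration, cataloging).
    return f"{exposure_id} CCD {ccd:02d} processed"

def run_single_epoch(exposure_id, ccds, n_cpus=4):
    # Fan the per-CCD work out over n_cpus worker processes,
    # matching the 4-CPU slots the SE jobs request.
    return Parallel(n_jobs=n_cpus)(
        delayed(process_ccd)(exposure_id, ccd) for ccd in ccds
    )

# Hypothetical usage for the 59 usable CCDs of one DECam exposure.
results = run_single_epoch("DECam_00012345", ccds=range(1, 60), n_cpus=4)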

The Need for Speed

• 6k CPU-hours is not that much in one day, but one can't wait a long time for them. Want to process images within 24 hours (15 is even better), allowing DES to send alerts out for even more follow-up while the object is still visible. The first event was spread over a longer period
• Necessitates opportunistic resources (OSG); possibly Amazon/Google at some point if opportunistic resources are unavailable
  – Did a successful AWS test last summer within the FNAL HEPCloud demo

Current GW Follow-up Analysis

• Followup analysis ongoing for most recent public LIGO trigger. • OSG job fraction somewhat higher than last year (increased GPGrid usage by FIFE/DES? More multicore jobs? Both?)

Peak OSG fraction about 40%

Current GW Follow-up Analysis

• Followup analysis ongoing for most recent public LIGO trigger in January. • OSG job fraction somewhat higher than last year (increased GPGrid usage by FIFE/DES? More multicore jobs? Both?)

Peak OSG fraction about 40% (here: SE jobs only, 4 CPUs)

Dwarf Planet Discovery

• DES researchers found dwarf planet candidate 2014 UZ224 (currently nicknamed DeeDee)
  – D. Gerdes et al., https://arxiv.org/abs/1702.00731
• All code is OSG-ready and is basically the same as SE + diffimg processing with very minor tweaks
  – After diffimg identifies candidates, other code makes "triplets" of candidates to verify that the same object is seen in multiple images (see the sketch below)
  – Main processing burst was July-August when FNAL GPGrid was under light load, so >99% of jobs ended up on GPGrid
• Required resources would have exhausted the NERSC allocation; FNAL with OSG as contingency was the only option
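Not from the original slides: an illustrative sketch of the "triplet" idea, assuming each candidate detection carries a time and a sky position, and keeping sets of three detections consistent with roughly linear motion on the sky. The data layout and threshold are hypothetical, not the actual DES code.

from itertools import combinations
import numpy as np

def make_triplets(detections, max_resid_arcsec=1.0):
    # detections: list of dicts with 't' (days), 'ra' and 'dec' (degrees).
    # Keep sets of three detections where the middle one lies close to the
    # position interpolated between the outer two, i.e. consistent with a
    # single slowly moving object seen in multiple images.
    dets = sorted(detections, key=lambda d: d["t"])
    triplets = []
    for a, b, c in combinations(dets, 3):
        if c["t"] == a["t"]:
            continue
        frac = (b["t"] - a["t"]) / (c["t"] - a["t"])
        pred_ra = a["ra"] + frac * (c["ra"] - a["ra"])
        pred_dec = a["dec"] + frac * (c["dec"] - a["dec"])
        dra = (b["ra"] - pred_ra) * np.cos(np.radians(b["dec"]))
        ddec = b["dec"] - pred_dec
        if np.hypot(dra, ddec) * 3600.0 < max_resid_arcsec:   # residual in arcsec
            triplets.append((a, b, c))
    return triplets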

Made at the Minor Planet Center site: http://www.minorplanetcenter.net/db_search/show_object?utf8=%E2%9C%93&object_id=2014+UZ224

Future Directions

• Have written a tool to determine template images given only a list of RA, Dec pointings, and then fire off single-epoch processing for each one (run during the day before first observations); a sketch of the idea follows below
• Incorporate the DAG generation/job submission script into the automated image listener (re-write in Python?) so everything is truly hands-off
• Working on ways to reduce job payload (we are I/O-limited)
  – A few more things now in CVMFS
  – Not sure cache hit rates would be high enough for StashCache to help with input images
• Applying the same techniques to the Planet 9 search
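A minimal sketch of what such a template-finding step could look like, assuming a catalog of previously taken exposures with known centers; the function names, catalog layout, and matching radius are hypothetical, not the actual DES tool.

import numpy as np

DECAM_FOV_RADIUS_DEG = 1.1   # DECam field of view is roughly 2.2 deg across

def angular_sep_deg(ra1, dec1, ra2, dec2):
    # Great-circle separation in degrees.
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (np.sin(dec1) * np.sin(dec2)
               + np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2))
    return np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))

def templates_for_pointings(pointings, template_exposures):
    # For each planned (ra, dec) pointing, list previously taken exposures
    # that overlap it and can therefore serve as subtraction templates;
    # single-epoch processing would then be launched for each match.
    matches = {}
    for ra, dec in pointings:
        matches[(ra, dec)] = [
            exp for exp in template_exposures
            if angular_sep_deg(ra, dec, exp["ra"], exp["dec"]) < DECAM_FOV_RADIUS_DEG
        ]
    return matches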

Additional Workflows on the OSG

• Several new workflows now OSG-capable
  – SN analysis: < 2 GB memory, somewhat longer run times due to heavy I/O (tens of GB) requirements. Nearly always run on GPGrid now, but this is not necessary
  – Simulations (some fit into the usual 2 GB RAM slots)
  – Other workflows require 4-8 GB memory; being run at FNAL right now. Not a requirement, but it is difficult to get such high-memory slots in general
• Other workflows include:
  – Deep learning in galaxy image analysis
  – CosmoSIS (cosmological parameter estimation): http://arxiv.org/abs/1409.3409

OSG benefits to DES

• When it works, it's great!
• Biggest issues are the usual ones: pre-emption and network bandwidth
  – Most DES workflows (at least so far) are very I/O-limited: some workflows transfer several GB of input
  – Expect StashCache to somewhat mitigate the problem, but only to a point
  – Still have to copy a lot of images around (currently most software doesn't support streaming)
• HPC resources may make more sense for other workflows (though having easy ways to get to them is really nice!)
  – Some analyses have MPI-based workflows. Works well when able to get multiple machines (not set up for that right now)
• Strong interest in additional GPU resources. DES will work with FIFE experiments on common tools for OSG GPU access

Summary

• Lots of good science coming out of DES right now in multiple areas
• OSG is and will be an important resource provider for the collaboration
• Opportunistic resources are critical for timely GW candidate follow-ups and TNO searches (e.g. Planet 9)
• Trying to get additional workflows onto OSG resources now
• Very interested in additional non-HTC resources (MPI and GPUs especially); OSG could be a great resource provider here

Credit: Reidar Hahn, Fermilab

Dataflow and Day-to-Day Operations With Grid Resources

• Dedicated ground link between La Serena and the main archive at NCSA (transfer is a few minutes per image)
• Nightly processing occurs at FNAL
  – Submitted from NCSA to the FNAL GPGrid cluster via direct HTCondor submission (see the sketch below)
  – Reprocessing campaigns (additional corrections, etc.) underway at FNAL
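Not from the original slides: a generic illustration of the kind of HTCondor submit description a single per-CCD diffimg job might use, loosely matching the resource numbers quoted earlier (2 GB RAM, ~50 GB local disk). The executable, file names, and paths are made up; the actual DES submission from NCSA is more involved.

# Hypothetical helper that writes an HTCondor submit description for one
# per-CCD job; the resulting file would be submitted with condor_submit.
SUBMIT_TEMPLATE = """\
universe              = vanilla
executable            = run_diffimg.sh
arguments             = {exposure} {ccd}
request_cpus          = 1
request_memory        = 2 GB
request_disk          = 50 GB
should_transfer_files = YES
transfer_input_files  = {exposure}_c{ccd:02d}.fits
output                = logs/{exposure}_c{ccd:02d}.out
error                 = logs/{exposure}_c{ccd:02d}.err
log                   = logs/{exposure}.log
queue
"""

def write_submit_file(exposure, ccd, path="job.sub"):
    with open(path, "w") as f:
        f.write(SUBMIT_TEMPLATE.format(exposure=exposure, ccd=ccd))
    return path   # then: condor_submit job.sub (on a machine with a schedd)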

STANDARD SIRENS

Compact Binary Coalescence

Motivation for Optical follow-up of GW events

• The "golden channel" is the merger of two neutron stars, with the GW component detected by LIGO and the EM component detected by a telescope
• If one can observe both the GW and EM components, it opens up a lot of opportunities
  – The CBC GW signal gives the distance
  – The EM counterpart gives the redshift (from the host galaxy)
  – Together they give a new way to measure the Hubble parameter

CBC = Compact Binary Coalescence
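A sketch of the low-redshift "standard siren" relation behind those bullets (the real analysis also accounts for binary inclination, peculiar velocities, and higher-order cosmology):

% The GW waveform amplitude gives the luminosity distance d_L,
% the EM counterpart / host galaxy gives the redshift z;
% at low redshift these combine into an estimate of the Hubble constant:
H_0 \simeq \frac{c\, z}{d_L}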

Marcelle Soares-Santos ◆ DECam Search for an Optical Counterpart to GW150914 ◆ Fermilab Wine & Cheese ◆ Feb 12 2016

Event Localization

• Similar to how our ears work
• With 2 detectors, the localization area can still be hundreds of sq. deg.
• With the Virgo detector, events would be localized to a few tens of sq. deg.

[Diagram (M. Soares-Santos): a merger event detected by the Hanford "ear" and Livingston "ear"; the arrival-time delay of ~a few milliseconds constrains the possible locations of the event on the sky.]
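A back-of-the-envelope version of the timing argument in the diagram, assuming a straight-line Hanford-Livingston baseline of roughly 3000 km:

% A source at angle \theta from the baseline between the two detectors
% produces an arrival-time delay
\Delta t = \frac{D}{c}\cos\theta ,
\qquad
\frac{D}{c} \approx \frac{3\times 10^{6}\ \mathrm{m}}{3\times 10^{8}\ \mathrm{m\,s^{-1}}} = 10\ \mathrm{ms}
% A measured delay of a few milliseconds fixes only \cos\theta,
% constraining the source to a ring on the sky rather than a point.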
