Podcaster: Rob Sparks, and Chuck Claver
Original podcast has been aired on June 17th, 2011 : https://cosmoquest.org/x/365daysofastronomy/2011/06/17/june-17th-the-data-from-the-lsst/
Description: The Large Synoptic Survey Telescope (LSST) is an 8.4 meter telescope, currently under development, that will be built in Chile with first light expected in 2018. Its large field of view and 3.2 gigapixel camera will enable astronomers to image the entire night sky visible from Cerro Pachon every three nights. In this podcast, Chuck Claver, LSST systems engineer, discusses the telescope and how the copious amounts of data it produces will make their way from the telescope to members of the scientific community, educators, students and the general public.
Bio: Rob Sparks is a science education specialist in the Education and Public Outreach (EPO) group at the National Optical Astronomy Observatory (NOAO) and works on the Galileoscope project (www.galileoscope.org), providing design, dissemination and professional development. He blogs at halfastro.wordpress.com.
Chuck Claver is a systems engineer at LSST.
Rob: Hi, this is Rob Sparks from the National Optical Astronomy Observatory and I would like to welcome you to this episode of the 365 Days of Astronomy podcast. I am sitting here this morning with Chuck Claver of the National Optical Astronomy Observatory and the Large Synoptic Survey Telescope. Good morning, Chuck.
Chuck Claver: Good morning.
Rob: First of all, I would like you to tell us a little about yourself and what you do here at NOAO and for the LSST.
Chuck: I am a staff scientist for the National Observatory and I am also on loan so to speak to the Large Synoptic Survey Telescope project. Within the LSST project I am the systems engineer responsible for making sure all the bits and pieces of the LSST come together. I am also a research scientist and have a lot of interest in the results the LSST will produce.
Rob: We are going to talk about how data will get from the LSST to a person’s computer, but first tell us a little bit about what the LSST is and what it will do.
Chuck: The LSST is a dedicated survey telescope, similar in spirit to the Sloan Digital Sky Survey that preceded it, with the difference that the LSST will add a time domain element to the all-sky survey. The LSST is an 8.4 meter telescope with an interesting three-mirror optical configuration that feeds a 3.2 gigapixel camera, and that camera will cover a circle approximately 3.5 degrees across on the sky. That is the equivalent of seven full moons. With that, the LSST will be capable of covering everything you can see in the night sky from a single location every three or four nights, and it will do this repeatedly, over and over again, for a period of about ten years. Every image that comes off the LSST will be compared with images that have been taken previously, and the software will analyze the differences in all objects seen: changes in brightness and changes in position. All that information is then published out to the research community (I will get back to that in a bit), and then every year all the data that have been taken by the LSST get analyzed as a static catalog and a static image set, and all that data is also available to our user community.
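A quick back-of-envelope check of the cadence Chuck describes: a circular field 3.5 degrees across covers roughly 9.6 square degrees, so tiling the sky takes on the order of a couple of thousand pointings. This is only a sketch; the ~20,000 square degree survey area is a commonly quoted LSST figure and is not stated in the interview.

```python
import math

# Field of view: a circle 3.5 degrees in diameter (from the interview).
fov_diameter_deg = 3.5
fov_area_sq_deg = math.pi * (fov_diameter_deg / 2) ** 2  # ~9.6 sq. degrees

# Assumption: ~20,000 sq. degrees of surveyable sky (a commonly quoted
# LSST figure, NOT stated in this interview).
survey_area_sq_deg = 20_000
pointings = math.ceil(survey_area_sq_deg / fov_area_sq_deg)

print(f"Field of view: {fov_area_sq_deg:.1f} sq. deg")
print(f"Pointings to tile the survey area: ~{pointings}")
```

Roughly two thousand pointings, spread over three or four nights, is what makes the repeated all-sky coverage feasible.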
Rob: So where is this telescope going to be built and what is the current construction schedule?
Chuck: The telescope itself will be located in northern Chile on the southern edge of the Atacama Desert on Cerro Pachon, right next door to the Gemini South and SOAR telescopes. The construction schedule, if things go well: we expect to start construction in fiscal year 2014, which would be October 2013, and that would put us on track to having first light in 2018.
Rob: I know you already have the mirror underway because I have seen it over at the mirror lab at the U of A (University of Arizona). Anyway, let's start with the journey of the data. The camera takes the exposure and the data is read out. What happens at that point?
Chuck: At that point the data is read out and the pixels come out in about two seconds from the time the shutter is closed. That data then flows down from the summit of Cerro Pachon to a base facility in the town of La Serena, and there we have designed a significant computing facility to process the image data, to produce the alerts, and to check for the variable phenomena I just mentioned, and that gets processed in sixty seconds. So all the objects that have been identified as transients or variables, or things that have changed since we last looked at that part of the sky, get published out to the user community sixty seconds after the exposure was taken. From there the data are streamed over the course of a day up to our archive center, which is at the National Center for Supercomputing Applications in Champaign-Urbana. At the archive center the data are then collected over time. They get reanalyzed every 24 hours to produce that night's photometric catalog, and then over a period of time the data get processed as a collection and we do what we call our annual data release. That data release produces what we call object catalogs and stacked images that are available to our user community.
Rob: So the alerts are out after sixty seconds. How long would it take for me to get the images and other data from the telescope?
Chuck: The data from the telescope are available within 24 hours after being taken to the general user community. So we go from the summit to the base to the archive center, and then we also have what we call our data access centers. Those are really the portals for the users to access the data. And there will be a co-located center at the archive center at Champaign-Urbana and another one down in Chile at the town of La Serena. That access center is meant to service the Chilean community who have a ten percent partnership in the project because we are located in Chile.
Rob: I understand you are going to have a data access center dedicated to education and public outreach for the public rather than for hard core scientists, right?
Chuck: That’s correct. There will be a headquarters facility located somewhere in the continental U.S. We haven’t decided where that facility will be located, but at that facility will be the Education and Public Outreach Data Access Center, which is slightly different from the science data access center because of the special needs of the EPO program.
Rob: So why is it important for all this to happen so quickly?
Chuck: There are several layers to the answer to that question. First of all, the quantity of data that comes off the LSST is so vast that we can’t afford to lag behind in its processing. To give a sense of the scale we expect in any given 24 hour cycle, there will be 15 terabytes of raw image data from the LSST. Over the course of a ten year survey we will develop an archive of seventy-five petabytes of image data and 25 petabytes of catalog data, so we have to keep up! So one of the things that drives us to a 24 hour reprocessing of the previous night’s data is just to keep up. As far as the transient alerts go, getting those out in sixty seconds is to enable other facilities to follow up on short-lived variable objects or transient objects. We know they are out there. A gamma ray burst is one example. There are many other things we have seen and we don’t have the slightest clue as to what they are, so we want to give other observatories the opportunity to rapidly follow up on things that the LSST sees.
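Those figures are easy to sanity-check. A rough sketch, assuming 15 terabytes every single night for ten years with no weather or maintenance downtime (a deliberate oversimplification of a real observing schedule):

```python
# Back-of-envelope: nightly raw data accumulated over the ten-year survey.
# Assumes 15 TB every night with no downtime, an upper-bound simplification
# of the figures quoted in the interview.
TB_PER_NIGHT = 15
NIGHTS_PER_YEAR = 365
SURVEY_YEARS = 10

raw_total_tb = TB_PER_NIGHT * NIGHTS_PER_YEAR * SURVEY_YEARS
raw_total_pb = raw_total_tb / 1000  # decimal petabytes

print(f"Raw image data over the survey: ~{raw_total_pb:.1f} PB")
```

That comes out to roughly 55 petabytes of raw pixels, the same order of magnitude as the seventy-five petabyte image archive Chuck mentions.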
Rob: So on a personal level, based on your research interest, what is the most practical application to you?
Chuck: My personal research interest is in the nearby solar neighborhood, and in particular the stellar remnants known as white dwarf stars. One of the things that the LSST will do that other surveys really haven’t done well, by adding the time domain, is become an astrometric machine. Astrometry is the precise measurement of the positions of stars in the sky. With the ten year baseline and the repeated visits, or repeated observations, that the LSST will have, we will be able to measure proper motions of stars and their parallaxes, and that will allow us to select a volume-limited sample of white dwarf stars, which has always been a very difficult task. From that, we use the white dwarfs in what we call stellar archaeology and investigate the history of star formation in the local stellar neighborhood.
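The astrometry Chuck describes rests on two textbook relations: a star's distance in parsecs is the reciprocal of its parallax in arcseconds, and a proper motion converts to a transverse velocity once the distance is known. A minimal sketch; the example star's numbers are hypothetical, chosen only for illustration:

```python
# Parallax -> distance: d [parsecs] = 1 / p [arcseconds].
# Example values below are hypothetical, for illustration only.
parallax_arcsec = 0.050               # 50 milliarcseconds
distance_pc = 1.0 / parallax_arcsec   # 20 parsecs

# Proper motion -> transverse velocity:
# v_t [km/s] = 4.74 * mu [arcsec/yr] * d [pc]
# (4.74 is the standard factor converting AU per year to km/s).
proper_motion_arcsec_per_yr = 0.5
v_transverse_km_s = 4.74 * proper_motion_arcsec_per_yr * distance_pc

print(f"Distance: {distance_pc:.1f} pc")
print(f"Transverse velocity: {v_transverse_km_s:.1f} km/s")
```

A volume-limited sample is then simply every white dwarf whose parallax places it inside a chosen distance cutoff, which is why measuring parallaxes in bulk makes the selection tractable.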
Rob: Great, thanks for joining me today, Chuck.
Chuck: You’re welcome.
Rob: This is Rob Sparks, thanks for listening to this episode of the 365 Days of Astronomy podcast. I will see you again next month.
End of podcast:
365 Days of Astronomy
The 365 Days of Astronomy Podcast is produced by the Astronomical Society of the Pacific. Audio post-production by Richard Drumm. Bandwidth donated by libsyn.com and Wizzard Media. You may reproduce and distribute this audio for non-commercial purposes. Please consider supporting the podcast with a few dollars (or Euros!). Visit us on the web at 365DaysOfAstronomy.org or email us at info@365DaysOfAstronomy.org. This year we celebrate the Year of Everyday Astronomers as we embrace amateur astronomer contributions and the importance of citizen science. Join us and share your story. Until tomorrow! Goodbye!