A WORKSHOP ON “SCIENCE WITH THE WILLIAM HERSCHEL TELESCOPE 2010-2020”
Burlington House, London, UK, 22-23 March 2010

QUESTIONS AND ANSWERS

Talk: Introduction
Speaker: Marc Balcells

Question 1 (Don Pollacco): Is the funding timescale reasonable in the current climate?
Answer 1: This is hard to guess now. We will not be asking for funds from the Agencies this year. Next year, the effort required will mostly be in the area of design. Instrument teams contributing to the WHT MOS may find several ways to exploit synergies with E-ELT, VLT and GTC design work. The big hardware costs would come in a few years’ time, by which time the financial climate will plausibly have improved.

Question 2 (Don Pollacco): Isn't the ING putting all its eggs in one basket with a single instrument proposal?
Answer 2: I agree with you that putting all our eggs in one basket would be a mistake. At ING we believe our support for diversity is one of our strengths, and we hope to continue to support such a model in the future. At the same time, if we want to adapt to the challenges of astronomy in the next decade, we need new instrumentation. Our strategy has a goal to strike the right balance between serving our community with large surveys and with classical observing. As I said in my talk, we need you, the community, to help us define what exactly the balance is.

Question 3 (Craig Mackay): Even if the spectrograph is funded, there will be a 6-year gap during which the nibbling away of ING’s budget and the sniping at ING science will continue. The INT/WHT need a big niche activity to cover these years.
Answer 3: We will have HARPS, but we do need more - either a new instrument and/or a new application.

 

Talk: MOS Science on a 4-m telescope
Speaker: Amina Helmi

Question 1 (Gavin Dalton): Would you be able to use a higher multiplex at R ~ 20,000, in terms of available targets?
Answer 1: Yes, for a multiplex of ~1000 there are enough targets (a large fraction of the fibres could be filled by disk stars rather than halo stars).
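
As a rough illustrative check (Python sketch; the 2-degree field diameter is an assumption, taken from the prime-focus corrector concepts discussed later in the workshop), the target sky density needed to fill ~1000 fibres would be:

    import math

    n_fibres = 1000        # multiplex quoted above
    fov_diam_deg = 2.0     # assumed field diameter (2-degree corrector concept)
    fov_area_deg2 = math.pi * (fov_diam_deg / 2.0) ** 2
    density = n_fibres / fov_area_deg2
    print(f"~{density:.0f} targets per square degree to fill all fibres")
    # -> ~318 per square degree; the answer above indicates that disk stars
    #    alone can supply this.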

Question 2 (Johan Knapen): What instruments will, or may, be available in the Southern hemisphere, and, related to this, how crucial will (additional) coverage of the Northern hemisphere be: useful extra constraints, or 100% crucial for discriminating between models?
Answer 2: Currently the only funded instrument is HERMES at the AAT, but it looks like it won’t go deeper than V ~ 14. We (the GAIA community) are also pushing for a wide-field MOS on an ESO telescope (possibly VISTA). It is crucial to have coverage of both hemispheres simply because the Milky Way is a whole-sky object: from the south we can probe the inner disk (and determine its history), while from the north we probe the outer disk.

Question 3 (Rob Jeffries):  What are the parameter precisions (RVs and chemical abundances) that are expected from R ~ 20,000 spectroscopy?
Answer 3: This depends also on S/N (see talk by Mike Irwin). With FLAMES in the HR mode (R ~ 20,000) we get radial velocities to ± 0.2 km/s and chemical abundances to 0.1-0.15 dex.

 

Talk: Cosmology surveys
Speaker: Ofer Lahav

Question 1 (Alan Fitzsimmons): Although some overlaps exist, surely the spectroscopic requirements for w-determination are different to those for galactic structure surveys?
Answer 1: Yes, I think one needs a brainstorming session to see whether both can be accommodated. The best analogy to this discussion is WFMOS: its proposals contained both components, BAO and Galactic archaeology. However, that was supposed to be done on 8-m telescopes, so there is a difference there. I agree that these are different specifications and different types of science, and you have to decide whether you can accommodate both, or whether you wish to have one lead the way, given the landscape of what else is happening.

 

Talk: PAU narrow-band cosmology survey
Speaker: Francisco Javier Castander

Question 1 (Tom Marsh): When will PAU be ready?
Answer 1: In 2 years.

Question 2 (Kirpal Nandra): What depths will be reached and over what survey area?
Answer 2: The PAU camera at the WHT prime focus, with its current corrector, would be able to survey ~2 deg²/night in 40 narrow-band and five broad-band filters, down to a magnitude limit of m_AB ~ 23.
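
For orientation only (the survey area below is a hypothetical number, not a PAU specification), the quoted ~2 deg²/night rate implies:

    rate_deg2_per_night = 2.0    # quoted above (40 narrow-band + 5 broad-band filters)
    area_deg2 = 100.0            # assumed, illustrative survey area
    nights = area_deg2 / rate_deg2_per_night
    print(f"{area_deg2:.0f} deg2 -> ~{nights:.0f} clear nights, before weather and overheads")
    # -> ~50 clear nights for 100 deg2 at the quoted rate.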

Question 3 (Janet Drew): I had heard that the PAU camera was going to go on a new telescope in Aragón. Has that changed?
Answer 3: The original plan was to build the PAU camera for the telescope that is planned to be built in Javalambre (Aragón) by CEFCA. However, we could not reach an agreement and now the PAU camera is being designed and built for the WHT at its prime focus with its current corrector.

Question 4 (Rafael Rebolo): Will you provide broad-band filters for your camera? What is your spatial pixel scale?
Answer 4: Yes, the PAU camera will have broad band filters. The pixel scale is 0.27”/pix.

 

Talk: Follow-up of radio surveys
Speaker: Huub Röttgering

Question 1 (Paul Groot): What are the complementary optical/NIR needs for the LOFAR Transient case, and the use of the WHT/INT in this area? What would be the requirements and the frequency you would need this instrument/telescope?
Answer 1: For follow-up of LOFAR transients, rapid response with both an optical/near-IR imager and a single-object spectrograph is essential. Access to medium and large telescopes (i.e. INT + WHT + GTC) is a big plus. MOS facilities are less relevant.

Question 2 (Don Pollacco): When LOFAR was devised, what plan did you have for follow-up?
Answer 2: [To be completed]

 

Talk: Follow-up of imaging surveys
Speaker: Mike Irwin

Question 1 (Boris Gänsicke): Is it technically feasible to have HR & LR spectrographs, and split the fibres/targets accordingly?
Answer 1: It is, and it can even be done within one spectrograph. For example, the design I was involved in for WFMOS on Subaru had exactly this arrangement, with a 3:1 or 4:1 reduction in the number of fibres going from low to high resolution. The salient technical drivers here are the number of targets and the detector acreage; the downsizing matches the 3-4 times more fibres at low resolution to the 3-4 times more resolution elements needed at high resolution.

Question 2 (Gavin Dalton): How much simultaneous spectral coverage do you need at high resolution? I think we can do λ/6 (i.e. ~100 nm).
Answer 2: A few thousand Ångströms, e.g. 4800-6800 Å.
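
To make the numbers in this exchange explicit (λ ≈ 600 nm is an assumed representative central wavelength), λ/6 compares with the requested range as follows:

    lam_nm = 600.0                 # assumed central wavelength
    offered_nm = lam_nm / 6.0      # "λ/6" from the question
    requested_nm = 680.0 - 480.0   # 4800-6800 Å from the answer, in nm
    print(f"~{offered_nm:.0f} nm per setting offered vs ~{requested_nm:.0f} nm requested")
    # -> ~100 nm vs ~200 nm: the requested coverage is roughly twice one λ/6 setting.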

Question 3 (Reynier Peletier): Obtaining very high resolution spectra is relatively expensive. If you have a large wavelength coverage, what is the minimum spectral resolution that you need?
Answer 3: It depends on what your science aims are, but for the majority of stellar work a minimum resolution of R = 5000 is necessary to give ~2 km/s velocity accuracy and to allow bulk abundance determination to 0.1-0.2 dex. For more detailed, so-called chemical-fingerprinting work, R = 20,000 is required to enable useful individual elemental abundance determinations.
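
A back-of-the-envelope way to see the R = 5000 → ~2 km/s figure (the centroiding fraction below is an assumption, standing in for fitting many lines at good S/N):

    C_KMS = 299792.458                 # speed of light, km/s
    R = 5000.0
    dv_element = C_KMS / R             # one resolution element, ~60 km/s
    centroid_fraction = 1.0 / 30.0     # assumed centroiding precision per spectrum
    dv_rv = dv_element * centroid_fraction
    print(f"resolution element ~{dv_element:.0f} km/s -> RV precision ~{dv_rv:.0f} km/s")
    # -> ~60 km/s per element, ~2 km/s after centroiding, matching the figure quoted above.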

Comment (Rob Jeffries): There is also a GAIA working group on clusters/associations and star formation that is meeting in May (Catania). The GAIA pre- and post-survey requirements are likely to be very similar, i.e. R ≳ 20,000 in the red (5000-9000 Å) to obtain chemical abundances to ~0.1 dex and RVs to ≲0.5 km/s.

 

Talk: WHT and the next generation of wide-field transient surveys
Speaker: Mark Sullivan

Question 1 (David Carter): Just to point out that FRODOSPEC on the LT, which is admittedly a 2m rather than a 4m, has much the same specification as the IFU spectrographs which you are interested in.
Answer 1: We have started a pilot programme with FRODOSPEC to see how faint it can go when observing SN targets, and to see if it is appropriate to use for PTF (Palomar Transient Factory) targets.

Question 2 (Paul Groot): With a transient every 20 minutes you could go to a dedicated telescope. Do you think this is feasible/desirable, given that we are also discussing a redistribution of the European 2-4m-class telescopes?
Answer 2: A dedicated telescope is clearly highly desirable – it would probably have to be a 2-m. Feasibility is another issue – is there money?
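
To put the quoted rate in context (the usable hours per night and the clear-weather fraction below are assumptions), one transient every 20 minutes corresponds to:

    minutes_per_transient = 20.0   # rate quoted in the question
    hours_per_night = 8.0          # assumed usable dark hours per night
    clear_fraction = 0.7           # assumed fraction of usable nights
    per_night = hours_per_night * 60.0 / minutes_per_transient
    per_year = per_night * 365.0 * clear_fraction
    print(f"~{per_night:.0f} transients per usable night, ~{per_year:.0f} per year")
    # -> ~24 per night, of order 6000 per year: the load behind the question
    #    about a dedicated follow-up telescope.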

Question 3 (Nial Tanvir): Do you see the time-frame of this science as being the next 5-6 years? (i.e. prior to the proposed move to wide-field spectroscopy.)
Answer 3: Yes – both PTF and PS1 are already operational and will run for about the next 5 years. Of course, after these, other surveys will continue to provide a stream of transient sources. A follow-up capability will always be needed.

 

Talk: Galaxy clusters
Speaker: David Carter

Question 1 (Marc Verheijen): How do you consider the trade-off between spatial resolution and specific grasp, i.e. light-collecting power?
Answer 1: It is an important trade-off – I would not like to see the fibre size go above about 1.5 arcsec, for the optical Tully-Fisher relation in particular. GIRAFFE has 0.5-arcsec spaxels, so the spaxel size giving the same effective light grasp at the WHT is ~1 arcsec.
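
The equivalence quoted here can be checked by requiring the same light per spaxel, i.e. D × θ conserved between telescopes (the apertures below are the commonly quoted values):

    d_vlt_m = 8.2       # VLT unit-telescope aperture (GIRAFFE/FLAMES)
    d_wht_m = 4.2       # WHT aperture
    theta_vlt = 0.5     # GIRAFFE spaxel size, arcsec (quoted above)
    theta_wht = theta_vlt * d_vlt_m / d_wht_m   # equal D^2 * theta^2 per spaxel
    print(f"equivalent WHT spaxel ~{theta_wht:.2f} arcsec")
    # -> ~0.98 arcsec, consistent with the ~1 arcsec quoted in the answer.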

Question 2 (Scott Trager): Just a reminder – Sanchez et al. have done IFU spectroscopy on intermediate-z clusters.
Answer 2: Yes, what becomes important then is how close you can get the buttons and how many objects you can get.

 

Talk: Nearby unresolved galaxies
Speaker: Reynier Peletier

Question 1 (?): [To be completed.]
Answer 1: [To be completed.]

Question 2 (Johan Knapen): I’d just like to make a small comment on the parameter space that you say is not covered by SAURON, i.e. high spectral resolution and a large field of view. That is already covered at the moment by another visiting instrument on the WHT, the Fabry-Perot instrument GHAFAS, which I think has ~10 km/s separation between planes and a ~3.5’ FOV. So I think the strength of what you propose is the combination of multi-line coverage, the large field and the high resolution.
Answer 2: [To be completed.]

Question 3 (?): [To be completed.]
Answer 3: [To be completed.]

 

Talk: Disks, star formation
Speaker: Armando Gil de Paz

Question 1 (Reynier Peletier): Since your field size and sampling are similar to MUSE’s, how does MEGARA expect to compete with it?
Answer 1: Actually our sampling is worse than MUSE’s, 0.685” vs. 0.2” per spaxel. However, there are two main areas where we will be more competitive than MUSE. MUSE’s spectral resolution ranges between 1700 (at 460 nm) and 3600 (at 930 nm), while our lowest resolution is 5600 (over the entire optical range), rising to 17,000 in certain spectral setups/ranges (Hα and the CaT, so far). We will also have the full blue optical range covered at R = 10,000, which is the resolution needed for the analysis of massive blue stars. Another advantage is that, because of the use of image slicers (and the fact that the main science drivers of MUSE are blind cosmological surveys), the wavelength coverage of MUSE starts at 465 nm, whereas ours starts at 370 nm, with all the implications that has for the study of ionized-gas diagnostics, of blue massive stars in the Milky Way and Local Group galaxies, of young and intermediate-age unresolved stellar populations, and even of the low-z tail of Lyman-alpha emitters. Finally, from GTC we have access to M31 and M33, which are critical for the blind spectroscopic study of RGB stars in Local Group galaxies.

Question 2 (Marc Verheijen): R = 10,000 spectroscopy of low-surface-brightness objects requires the largest possible specific grasp. Spaxels of 0.6" are probably too small. For reaching a particular surface-brightness level, the telescope aperture is not very relevant; in fact, large apertures work against you because of the focal-plane plate scale, which requires physically bigger fibres and hence results in lower spectral resolution.
Answer 2: I agree that the F/17 effective focal ratio of the GTC foci does not allow very large fibres to be placed on the focal plane. We found a compromise of 100-micron-core fibres with lenslets (for the conversion from F/17 to F/3 at the entrance of the fibre) that sample the above-mentioned 0.6" per fibre. However, when analyzing low-surface-brightness regions (e.g. the outer parts of unresolved galaxies) we will combine multiple spaxels to compensate for the small spaxel size. Also note that our observing strategy for nearby galaxies includes the study of resolved RGB stars in Local Group galaxies; in that case, having a fibre diameter that matches the site seeing is certainly an advantage over the large fibres commonly used on other instruments (typically on smaller telescopes).
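
The ~0.6 arcsec figure follows from the standard plate-scale relation θ[arcsec] = 206265 × d_fibre / (N × D), with the fibre core and feed focal ratio N as quoted above and D the GTC aperture:

    fibre_core_m = 100e-6    # 100-micron core, quoted above
    n_feed = 3.0             # F/3 after the lenslet, quoted above
    d_gtc_m = 10.4           # GTC aperture
    theta_arcsec = 206265.0 * fibre_core_m / (n_feed * d_gtc_m)
    print(f"on-sky fibre diameter ~{theta_arcsec:.2f} arcsec")
    # -> ~0.66 arcsec, consistent with the ~0.6 arcsec per fibre quoted above.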

 

Talk: Milky Way surveys – IPHAS, UVEX
Speaker: Janet Drew

Question 1 (Don Pollacco): What about GAIA?
Answer 1: IPHAS and UVEX photometry covers much the same magnitude range as that over which Gaia will harvest astrometry. For sources towards the faint end and/or in crowded regions, it is likely that the ground-based photometry will provide superior SED snapshots to those Gaia can achieve. There is a lot in common between the kinds of science IPHAS and UVEX can support and those that Gaia will feed – by the time Gaia is launched I would expect we will already have a 3-D extinction map of the northern Galactic plane, which will only get a whole heap better when the initial photometric parallaxes can be replaced by astrometric ones.

Question 2 (Marc Balcells): You are looking at the Galactic plane; wouldn’t you be better off with a NIR MOS?
Answer 2: For some applications, yes (e.g. work on star-forming regions). But there is an issue with the performance of 4-m telescopes in this regard – better to go to 8-m instruments. For most applications flowing from IPHAS and UVEX, the optical is more important, and the requirements look rather like those of GAIA.

 

Talk: Galactic archaeology
Speaker: Gerry Gilmore

Question 1 (Chris Evans): Do you have a finger-in-the-air estimate of how much the UK share of GAIA has cost?
Answer 1: About €200M through the whole project.

Question 2 (Reynier Peletier): You showed some very accurate abundances with strong conclusions. Is the accuracy as high as you claim, or are there any systematic effects?
Answer 2: In narrow ranges of temperature and gravity, differential element abundances can be as reliable as 2-3 percent. Over wider ranges of course systematics become important. The lesson is to have very large samples, so one can then do superb differential analyses.
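
For reference, the quoted 2-3 per cent differential precision converts into dex (log10) units as:

    import math

    for frac in (0.02, 0.03):
        print(f"{frac * 100:.0f}% -> {math.log10(1.0 + frac):.3f} dex")
    # -> 2% ≈ 0.009 dex and 3% ≈ 0.013 dex, roughly ten times tighter than the
    #    ~0.1 dex bulk-abundance figures quoted elsewhere in these sessions.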

 

Talk: Stellar studies
Speaker: Artemio Herrero-Davó

Question 1 (Francisco Najarro): How many of the key issues to be tackled in massive star science would be excluded if the MOS were to have a resolution of ~ 5000?
Answer 1: Only the work related to stellar pulsation analyses would be affected, as this requires resolutions of R ~ 20,000-40,000 (i.e. at the upper end of stellar abundance analyses and the lower end of precision radial-velocity studies). The rest of the topics would remain substantially unaffected; for some of them, R ~ 5000 could even be the right resolution.

 

Talk: Substellar objects and Exoplanets
Speaker: Rafael Rebolo

Question 1 (Alan Fitzsimmons): For wide-field near-IR searches for free-floating planets, are those needs covered by VISTA and UKIRT/WFCAM?
Answer 1: Certainly, these telescopes/instruments have the potential to discover free-floating planetary-mass objects, provided sufficiently deep surveys are planned in the appropriate star-forming regions and star clusters. Large-scale surveys may be too shallow for this. A wide-field imager at the WHT could carry out dedicated surveys, or the mandatory follow-up work (photometry and astrometry) of potentially interesting targets found at other telescopes.

 

Talk: Time domain
Speaker: Boris Gänsicke

Question 1 (Marc Balcells): Would you benefit from going bluer than 380nm?
Answer 1: Somewhat, but 380nm would be sufficient.

 

Talk: Diffraction-limited imaging in the visible at the WHT
Speaker: Craig Mackay

Question 1 (Alan Fitzsimmons): What Strehl ratio can you achieve with 30-40% selection, and over what field of view do you get good resolution?
Answer 1: A Strehl of 15-20% and a FOV >1’, possibly 2’-3’.

Question 2 (Reynier Peletier): How is it possible that many people who worked on AO at, e.g., La Palma did not go even part of the way you are describing?
Answer 2: This is a very interesting question. I find that groups already working in AO tend to be very dismissive of the lucky approach, while those with no investment in it are extremely enthusiastic. The AO community know how difficult their work is and find it very hard to believe it is as easy as I have demonstrated it to be. Outside the instrumentation world there is very little knowledge of, or interest in, the technicalities of the systems, and it is only now, by demonstrating actual performance on real astronomical objects, that they are beginning to take notice. There is another important component, which is that a lot of AO work is targeted at extreme AO, particularly for the detection of exoplanets. This is an area in which I really do not think lucky imaging will play a major role, because it is extremely hard. However, lucky imaging plus AO is the only way at present to do anything in the visible. I should also mention that there have been lucky-imaging expeditions to La Palma by our group and also by the IAC FastCam group, using a commercially available detector system.

Question 3 (Nial Tanvir): Basic theory suggests that obtaining diffraction-limited imaging over a large aperture requires very high frame-rejection rates – does the same theory predict the low rejection rates you find are required when lucky imaging is used in conjunction with low-order AO?
Answer 3: Yes.

Question 4 (Paul Groot): Why not take your AO plus LuckyCam instrument straight to the VLT and not bother with the WHT directly?
Answer 4: This is what we are funded to do, but at the VLT it can only be a visitor instrument – they would want a very big project which would be expensive and lengthy, but they may go for it if our tests work as well as we hope.

Question 5 (Gavin Dalton): Does it work for a spectrograph?
Answer 5: Yes, but it’s more complicated. If you imagine using the IFUs that you’ve been talking about, then you can take that same IFU and put it onto your spectrograph (just as you normally would, but with much higher resolution pixels). You have another IFU looking at the reference star at the same resolution, and you look at the output photometry (total light) from the reference star. Then you (a) select spectra taken when the reference-star light is concentrated in one fibre, and (b) note the location of that fibre. That tells you everything you want to know: that you have a sharp image, and which of the spectra (which fibre) you should be associating with which part of the target object. So it will work well; you can see it’s a little more complicated, but not that much more. But you’re not going to use your slow-scan 2k×4k CCDs; you’re going to use an electron-multiplying CCD. That’s not out of the question: they’re thinned, they’re good, they’re cheap and they work.

 

Talk: APOGEE: H-band multi-object spectroscopy around the Galaxy
Speaker: Carlos Allende Prieto

[Questions and Answers are missing. To be completed.]

 

Talk: Introduction
Speaker: Marc Balcells

[Question/Answer missing. To be completed.]

 

Talk: Instrumentation for Cosmological Surveys
Speaker: Ray Sharples

Question 1 (Scott Trager): Could you do the same science at lower resolution? Why not just use PRIMUS?
Answer 1: At the low resolution of PRIMUS (R ~ 40) there are significant issues with sky subtraction, especially in the red (λ > 0.7 μm) and if very short slits are used to maximise the number of targets. Higher spectral resolution is also required to study the velocity dispersions of small groups (σ < 100 km/s) and redshift-space distortions.
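
The velocity-dispersion point can be made quantitative (assuming the resolution element should be at least comparable to the dispersion being measured):

    C_KMS = 299792.458
    r_primus = 40.0                  # PRIMUS resolution quoted above
    sigma_group = 100.0              # km/s, group dispersions quoted above
    dv_element = C_KMS / r_primus    # velocity span of one resolution element
    r_matched = C_KMS / sigma_group  # R whose element just matches sigma_group
    print(f"R~40 element spans ~{dv_element:.0f} km/s; matching 100 km/s needs R ~ {r_matched:.0f}")
    # -> ~7500 km/s per element at R ~ 40, versus R of a few thousand or more
    #    to resolve ~100 km/s motions.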

Question 2 (Matt Jarvis): Are the spectral range and position of these fixed?
Answer 2: No, there are three (manually) interchangeable grating/lens combinations in the current design: 420-520 nm, 520-720 nm and 720-920 nm. The middle one is the most relevant for the z ~ 0.7 redshift surveys discussed. A longer wavelength range (e.g. 520-850 nm) would be possible, at a slight loss of spectral resolution (image quality) and with a reduction in the number of simultaneous spectra (longer spectra).

Question 3 (Reynier Peletier): Would you be able to go to a resolution of R ~ 5000 and what wavelength range would you get?
Answer 3: Probably not to R ~ 5000. We have looked at getting R ~ 3000 over a small spectral range around the calcium triplet (850-860 nm), which looks feasible, but going much beyond this would require the use of much more expensive glasses in the spectrograph, etc.

 

Talk: New ideas for highly-multiplexed spectroscopy
Speaker: Jeremy Allington-Smith

Question 1 (Scott Trager): Why not do a spectroscopic survey of the entire sky (e.g. emission-line galaxies / objects with one or a few lines in the wavelength region) and down-select after the fact?
Answer 1: You can’t afford the huge number of detector pixels needed (up to ~10¹⁶ for an ELT).
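
One illustrative way (not necessarily the speaker's own estimate) to see how pixel counts of this order arise is to multiply the number of spatial samples over the whole sky by the pixels per spectrum; the sampling and spectral-pixel values below are assumptions:

    sky_arcsec2 = 41253.0 * 3600.0 ** 2   # whole sky, ~5.3e11 arcsec^2
    spaxel_arcsec = 0.3                   # assumed spatial sampling
    spectral_pixels = 4000                # assumed pixels per spectrum
    n_pixels = (sky_arcsec2 / spaxel_arcsec ** 2) * spectral_pixels
    print(f"~{n_pixels:.0e} detector pixels")
    # -> ~2e16 for these assumptions, i.e. the 10^16 scale quoted above.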

[Question/Answer missing. To be completed.]

 

Talk: VPH-based mid-high resolution spectrograph design
Speaker: Gavin Dalton

Comment (Maria Luisa Garcia Vargas): To provide a wide range of spectral resolutions in the same spectrograph, we propose the sliced-grating method (invented by us), which divides the pupil into slices, smoothing the angles on each slice and then recovering the image on the detector. We have applied this method to ELMER, an instrument for the GTC (R = 500-10,000), and to MEGARA (also for GTC; R = 5900-20,000). Pupil size = 160 mm. (For more information see the poster, or contact by email at marisa.garcia@fractal-es.com.) Poster at
http://www.ing.iac.es/conferences/wht201020/presentations/garcia_poster.pdf.

Question 1 (Marc Verheijen): Could the fibres be fed at f/2.8? Can you exchange the fibre slit? Considering the transmission curve, what is the fibre length?
Answer 1: (i) The design assumes the fibres are fed at f/3.6, but with pupil-image injection. This results in a small change in the projected size on the sky, from 1.2” to ~0.95”. (ii) Yes, the EVE design accommodates 8 slits. (iii) The fibre length is 15 m.
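
The quoted change in projected size is consistent with the on-sky fibre diameter scaling inversely with the feed focal ratio (taking f/2.8 from the question and f/3.6 from the answer, and ignoring pupil-imaging details):

    theta_f28 = 1.2                     # arcsec, projected fibre size at f/2.8 (from the question)
    theta_f36 = theta_f28 * 2.8 / 3.6   # size scales as 1/(focal ratio) for a fixed fibre core
    print(f"projected size at f/3.6 ~{theta_f36:.2f} arcsec")
    # -> ~0.93 arcsec, in line with the ~0.95 arcsec quoted in the answer.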

 

Talk: Fibre positioner development for survey instruments
Speaker: Marco Azzaro

Question 1 (Gavin Dalton): How small can you make the cell size? AVS concept was dropped from EVE at the MTR as there was no obvious way to integrate IFUs.
Answer 1: If cell size refers to the reach of one fibre actuator, the answer is that we don't know exactly. We will soon start working on a concept for making the centre-to-centre distance between actuators as small as 11 mm. Our present prototype is 29.2 mm. This is NOT the cell size, which is always larger than the centre-to-centre distance (33.72 mm diameter in our current prototype), but it is the parameter which controls the size of one actuator.

Question 2 [?]: Is there an active feedback?
Answer 2: In principle it is open loop.

Comment (David King): It seems very similar to the positioner system we were going to use for the WFMOS design study we were involved in. The size of our units was much smaller than that – down to ~8-9 mm diameter. We were using ring motors to do this, so it’s a continuous motor, not a stepper motor. I can get the details of the manufacturers for you.

Question 3 (Scott Trager): How close can you place two fibres with the prototype system?
Answer 3: I can’t remember the exact number, but it should be around a few mm centre-to-centre (around maybe 6mm).

 

Talk: 2-degree WHT PF optical correctors
Speaker: Tibor Agócs

Question 1 (Reynier Peletier): Does your design also work in the near-infrared J and H bands?
Answer 1: Yes, it has similar image quality to that in the I band, i.e. below 0.5 arcsec (50-80% encircled-energy diameter) over an almost 1.8-1.9-degree FOV.

Question 2 (?): Are the surfaces spherical/aspherical?
Answer 2: [To be completed.]

Question 3 (Gavin Dalton): Is the focal plane telecentric or concentric? What is the trade-off between cost and image quality?
Answer 3: (1) The designs are concentric: they have a curved focal plane whose centre of curvature is at the pupil, so that the chief rays are perpendicular to the image plane.
(2) Both designs have an EE80 (80% encircled-energy) diameter of less than 0.5 arcsec over almost 1.7-1.8 degrees. Cost-reduction considerations have already been taken into account, but further savings could be made by eliminating the aspherical surfaces. Naturally, the image quality would degrade as a consequence; the amount of degradation, and whether it would compromise the science with the proposed instrument, still has to be investigated.

Question 4 (Paul Groot): What will be the throughput of the corrector lenses when anti-reflection coatings are included?
Answer 4: It will be around 80-85%, or even a bit higher, between 400 and 1000 nm.

Question 5 (Fraser Clarke): What is the mass of the system?
Answer 5: The lenses weigh roughly 400-450 kg in total, of which the first lens accounts for almost 200 kg.

 

Talk: Wide-field correctors for WHT: forward-Cassegrain options
Speaker: David King

Question 1 (Paul Groot): How difficult is it to get the spectrograph down to 350nm (which will be important for any GAIA science case)?
Answer 1: Currently the short-wavelength cut-off is ~450 nm; it should be possible to push it to ~380 nm without too much difficulty. Below that it does pose problems, but with fibres do you want to get down to 350 nm?

Question 2 (Marc Verheijen): What are the options (trade-offs) for a reflective spectrograph?
Answer 2: I do have designs for reflective collimators and cameras, including an f/1.0 Schmidt camera. The trade-off is obstruction (there is no secondary shadow to hide the detector behind), but the throughputs of a dioptric camera and a Schmidt-type camera are likely to be similar.

 

Talk: Specs for a wide-field IFU
Speaker: Scott Trager

Question 1 (Gavin Dalton): You can probably get better throughput for the large IFU at prime.
Answer 1: I agree, but I worry about the prime focus corrector – also, we need to think about the mounting at prime focus.

Question 2 (Jeremy Allington-Smith): I don’t think you need ADCs with IFUs because you can correct afterwards.
Answer 2: That’s actually true. If you can cover the whole field at once – you can either dither or you can put lenslets in - and then you basically use the wavelengths at each position to re? everything. And that’s actually what we’re doing at Calar Alto at the moment with PPAK.