Frequently Asked Questions
FAQ about the FAQs. The following is a list of questions. Each answer appears below in this file, with a hyperlink from the question. Some browsers (older versions of Netscape) do not follow these hyperlinks - in that case just browse down the file to find the answer.
Problems / questions for the current version (C94 and later) of the code
Is it really necessary to read all of Hazy to find out which commands changed between C90 and C96?
Problems / questions for older versions of the code
I made the gas very hot or put in an extra source of heat or ionization, but the
ionization did not increase above a certain level. Why? (90 and before)
Many of the commands shown in Hazy have an underscore before or after a keyword. An example is "blackbody 6 _lte". When I include the underscore the key is not recognized.
The underscore indicates a required space. The correct syntax is "blackbody 6 lte".
All of the test cases that are part of the distribution include many "assert" commands. These have nothing to do with the simulation or the astrophysics. Instead, they provide a way for the code to validate itself automatically. Assert commands are described towards the end of Part I of Hazy, and tell the code what answer it predicted in the past. If it does not obtain this answer it will print a comment. Here in Lexington the entire test suite is recomputed every night, and the assert commands provide a way to discover whether any changes have affected predictions. The assert commands can be safely removed or ignored. In fact the test suite includes a perl script, tests_remove_asserts.pl, which will remove them automatically.
The correct reference for versions 90 and 94 is the following review paper:
Ferland, G. J., 2001, Hazy, a brief introduction to Cloudy 96.00
The citation should explicitly mention the version of the code. This includes the sub-version (the .00 in this example) and the results of any hot fixes that have been applied; in this example a hot fix would bring the version up to 96.00a. (This last part only changes if the hot fix affected predicted quantities.)
There are two reasons for this citation. First, I need to track the use of the code so that I have something to show my funding agency. I can quickly pick up citations like the one above using web-based literature searches. Second, this citation provides a clear statement of the exact version of the code that was used, so that anyone else can redo the calculations. The predictions of the code change with time due to changes in the atomic/molecular database, improvements in the physical simulations, and bug fixes. With this precise statement of the sub-version, and the documentation on the Cloudy web site, anyone can reproduce the calculation for themselves.
It's Cloudy, not CLOUDY. The tradition of capitalizing names of programs goes back to FORTRAN 77 and its predecessors. The ANSI/ISO standard required that source code be in capital letters. Both modern Fortran and C allow mixed case. It's Cloudy.
I don't see absorption lines in the continuum produced by the punch continuum commands. What happened?
No attempt is made to superimpose the thousands of absorption lines on the continuum. The one exception is Lya, which can have very broad damping wings that produce Rayleigh scattering. The lines and continuum do know about each other, although they live in separate quarters.
The basic reason for this approach is that the continuum resolution (dE/E)
used by the code is typically a percent. Lines are so much sharper that
they would not be visible at this resolution even if they were black at their
center. A typical intrinsic line width might be 10 km/s (depending on
atomic mass and temperature), while a 2 percent continuum resolution corresponds
to a cell width of 6000 km/s. So a black 10 km/s line would create only a
10/6000 depression across the continuum cell. Although it would be
possible to artificially increase the contrast, as is done with the set PunchLWidth command, the preferred approach is to generate an observed spectrum ex post facto.
Have the code generate a list of all optically thick lines. Cloudy
includes all Opacity Project resonance lines, so there are tens of thousands of
possible transitions. In version 94 and before, line optical depths can be printed with the optical depth options of the punch command, described in Part I of Hazy.
Last, write code to attenuate the continuum by absorption of these lines. If the gas brightness temperature is much smaller than the continuum brightness temperature then only absorption need be included (not reemission). Equation 3-6 of Spitzer (Physical Processes in the Interstellar Medium, 1978) shows how to do this. The optical depth at each frequency is given by equation 9-45 of Mihalas (Stellar Atmospheres, 1978).
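The last step might be sketched as follows: a pure-absorption pass over the continuum, attenuating by a Doppler (Gaussian) optical depth profile. The profile shape, grid, and all numbers here are illustrative assumptions, not Cloudy output:

```python
import numpy as np

def attenuate(v_kms, I_cont, tau0, b_kms):
    """Pure absorption, I = I0 * exp(-tau(v)), using a Doppler profile with
    line-center optical depth tau0 and width b_kms (no reemission)."""
    tau = tau0 * np.exp(-(v_kms / b_kms) ** 2)
    return I_cont * np.exp(-tau)

v = np.linspace(-100.0, 100.0, 401)        # velocity grid about line center, km/s
I = attenuate(v, I_cont=1.0, tau0=5.0, b_kms=10.0)
# the line core is nearly black; the far wings recover the continuum
```

In the pure-absorption limit each optically thick line simply multiplies the continuum by exp(-tau); looping this over the list of thick lines builds up the observed spectrum.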
I made the gas very hot by putting in an extra source of heat or ionization, but the ionization did not increase above a certain level. Why?
Versions 90 and before of the code only considered ionization stages that could be created by the incident continuum. If the incident continuum is very soft then only low stages of ionization will be considered. If a large extra source of heat or ionization is added, the level of ionization will only increase to the highest stage possible with the soft incident radiation field. This is discussed in several places in Part I of Hazy, where the work-around is also described: simply add a very hard continuum source with a vanishingly small ionization parameter. The following pair of commands should do the trick and have no influence on the computed level of ionization.
* add a very hot brems continuum
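A minimal sketch of such a pair, following the comment above (the log temperature of 6 and log ionization parameter of -10 are illustrative assumptions, not values taken from Hazy):

```
brems 6
ionization parameter -10
```

The hard bremsstrahlung continuum extends the radiation field to high energies, while the tiny ionization parameter keeps it from affecting the computed ionization balance.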
Versions 94 and later no longer need to have the continuum defined over all possible energies, and this problem will not occur.
The code did not include lines in the output from the punch continuum command before version 90. All lines are now included, but there are several tricky points to watch out for. The basic problem is that the code knows what the intensity or luminosity of the line is, but it does not know over what frequency or wavelength range that energy is spread.
The line to continuum contrast is a linear function of the assumed line width. The intensity of a line in the continuum output file will only be equal to the predicted intensity if the line width is set to the speed of light. This is discussed in Part I of Hazy where the punch continuum command is described; see the subsection on line to continuum contrast. The set PunchLWidth command, described there and later in Hazy, allows you to change the line width to the appropriate value.
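The linear scaling can be written down directly. This is illustrative arithmetic, not Cloudy's internal routine (the function name and inputs are made up):

```python
C_KMS = 2.9979e5   # speed of light, km/s

def cell_contrast(I_line, nuFnu_cont, width_kms):
    """Fractional line contribution to a continuum cell: linear in the
    assumed line width, and the full predicted intensity when width = c."""
    return (I_line / nuFnu_cont) * (width_kms / C_KMS)

full = cell_contrast(1.0, 2.0, C_KMS)      # width = c: the true ratio, 0.5
narrow = cell_contrast(1.0, 2.0, 10.0)     # a 10 km/s line: tiny contrast
```

This is why a physically narrow line nearly vanishes in the coarse continuum output unless PunchLWidth is increased.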
All lines include contributions from continuum pumping. This component may or may not be a net contributor to the observed line, depending on whether the continuum source is in the beam. The pumped part of the line is not included in the punch continuum output for this reason. Direct continuum excitation will generally be very important for clouds that are optically thin in the lines. The pumped contribution can be printed out separately, as described in Hazy I, page 100.
Acknowledgements: Thanks to Christian Knigge for help.
The emission lines now consist of several sets that represent the archeological past of the code. This link gives some of the more prominent strong lines and their identification in the code.
Each line is identified by a four-character label and an integer wavelength. This strategy is outlined in Part IV, pages 612-614. Often the line label is the spectroscopic designation of the spectrum, such as H 1 or Fe19. For many lines various contributors are also indicated; these will have different labels but the same wavelength. The integer wavelength can be in various units, depending on its order of magnitude. This is described on page 626 of Part IV.
Only the strongest lines are printed to save space, and not all contributors to lines are printed. You can change the threshold for the faintest line to print with the print faint command described on page 99 of Part I. You can print various contributors to many lines with the print line options commands described on pages 99-100 of Part I.
I usually use Cloudy as a subroutine to run extensive grids. In this case the intensity of a line can be pulled out of the emission line stack by calling routine cdLine, described on pages 407-408 of Part III.
The emission line identifications represent the history of how the code was developed. This will eventually be rationalized, but not very soon. Here is an outline of the line groups and their descriptions:
The classical forbidden lines and other specific information: These were the first lines and were coded in on an individual basis. They are described on pages 627-634 of Part IV.
Optically thin recombination lines of C, N, and O. These come from the Nussbaumer and Storey and Pequignot et al. works. They are described on pages 634-635 of Part IV and all lines are listed on pages 636-637.
level 1 transferred lines. These are mostly strong UV and FIR lines. These lines have accurate collision strengths and wavelengths. The individual lines are listed on pages 638-639 of Hazy IV.
level 2 transferred lines. These lines come from Dima Verner's list of Opacity Project lines. The line wavelengths are theoretical and frequently off by 10%. Collision strengths come from various g-bar approximations, which is a polite way of saying that they are a wild guess. The transition probabilities come from the Opacity Project and are quite accurate. The lines are listed on pages 639-681 of Part IV.
Hydrogen lines. These are transferred in a manner similar to the level 1 lines. Their identification begins with the label "H 1" followed by the line wavelength. If infrared lines are of interest it will be necessary to increase the number of levels in the model hydrogen atom. This is done with the hydrogen levels limit command described on page 77 of Part I.
Contributions to transferred lines. You can break out various contributions to the transferred lines with the print line options commands described on pages 99-100 of Part I. This is further described in section 3.4.2 starting on page 635 of Hazy IV.
You can produce a file with all the line labels and wavelengths with the punch line labels command, described on page 110 of Part I. To get a complete list of all possible lines, include the following two commands in an input file:
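A plausible pair (the test command, which sets up a minimal complete model, is an assumption here; only the punch line labels command is confirmed above):

```
test
punch line labels
```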
For regions with lower densities, say below log(n) = 8 [cm^-3], you can safely assume Case B emissivity. In this case the full recombination spectrum is a simple function of the density and temperature. Various weighted mean densities and temperatures are listed in the summary after the end of the calculation. Look up faint lines relative to some line Cloudy does predict using standard tables such as Storey & Hummer (MNRAS 272, 41) for hydrogen or hydrogenic ions, or Smits (MNRAS 278, 683) for helium.
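A sketch of the Case B lookup. The 2.86 Halpha/Hbeta ratio is the standard low-density value near 10^4 K; read the exact number from the Storey & Hummer tables for your conditions, and the Hbeta intensity here is hypothetical:

```python
# Scale a hydrogen line Cloudy does not predict from one it does, using a
# tabulated Case B ratio.  Ratio and reference intensity are illustrative.
CASEB_HA_OVER_HB = 2.86        # I(Halpha)/I(Hbeta), ~1e4 K, low density

def faint_line_intensity(I_ref, ratio_to_ref):
    """Case B estimate: faint line = reference line * tabulated ratio."""
    return I_ref * ratio_to_ref

I_hbeta = 1.0e-3               # hypothetical predicted Hbeta intensity
I_halpha = faint_line_intensity(I_hbeta, CASEB_HA_OVER_HB)
```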
It is simple to add one of these extra lines to the code using Case B emissivity. Peter Martin did this for a series of lines in subroutine addln1. It should be clear how to add more by looking at his logic.
Versions 90 and before of the code had trouble with densities below n = 10^-4 cm^-3 due to underflow on 32-bit processors. A typical cooling rate for temperatures near 10,000 K is 10^-24 n^2 [erg cm^-3 s^-1]. The heating or cooling per unit volume will be < 10^-32 erg cm^-3 s^-1 for n < 10^-4 cm^-3. The problem arises when the code evaluates the cooling derivative for the temperature predictor/corrector - the derivative is of order cooling/temperature, or < 10^-36 erg cm^-3 s^-1 K^-1, close to the underflow limit of a 32-bit processor. When underflow occurs the code may find that the cooling or its derivative is zero, which is interpreted as insanity. The code will stop or crash under these conditions.
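The arithmetic can be demonstrated directly with 32-bit floats. This sketch (all numbers illustrative, using NumPy's float32 as a stand-in for single precision) shows a small contribution to the cooling derivative underflowing to exactly zero:

```python
import numpy as np

# Typical cooling near 1e4 K: ~1e-24 * n^2 erg cm^-3 s^-1 (scale from the text)
n = np.float32(1e-4)                    # density, cm^-3
cooling = np.float32(1e-24) * n * n     # ~1e-32, still a normal float32
dCdT = cooling / np.float32(1e4)        # ~1e-36, near the float32 limit (~1.2e-38)
minor = dCdT * np.float32(1e-8)         # ~1e-44: subnormal, precision already lost
gone = minor * np.float32(1e-2)         # ~1e-46: rounds to exactly zero
```

A contribution that has silently become zero is exactly the kind of result the code then interprets as insanity.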
One simple solution would be to compile the entire code with the compiler option set to interpret all floating point as double precision. There are two problems: the code has not been completely tested when compiled this way, and there are places where it explicitly checks for numbers smaller than roughly 10^-37 and sets them to zero (to protect against gradual underflow, which can cause numerical problems later). But this will probably work.
The better solution is to rescale the entire problem and work at a higher density. Photoionization models obey homology relations with the ionization parameter: as long as the density is below about 1000 cm^-3, the density has no direct effect on the model if the ionization parameter is kept constant. (Above 1000 cm^-3 some coolants can be collisionally deactivated.) The best approach is to work at a density of 1 cm^-3 and scale the flux and background up by the ratio 1 cm^-3 / n(desired), where n(desired) is the density you want, so that the ionization parameter is unchanged. The intensity (emission per unit area) will then be too large by this same ratio, but the temperature, level of ionization, and column densities will be correct.
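The bookkeeping can be sketched with hypothetical numbers (the flux value is illustrative): hold U = phi/(n c) fixed while moving the computation to 1 cm^-3, then correct the predicted intensities afterwards.

```python
# Rescale a low-density problem to a compute-friendly density at constant
# ionization parameter U = phi / (n * c).  All numbers are illustrative.
C_CM_S = 2.9979e10                      # speed of light, cm/s

def rescale(phi_desired, n_desired, n_model=1.0):
    """Return (flux to use at n_model, intensity correction factor)."""
    scale = n_model / n_desired         # e.g. 1 / 1e-4 = 1e4
    return phi_desired * scale, scale

phi_model, scale = rescale(phi_desired=1.0e4, n_desired=1.0e-4)
U_desired = 1.0e4 / (1.0e-4 * C_CM_S)   # U of the physical problem
U_model = phi_model / (1.0 * C_CM_S)    # U of the rescaled model: identical
# Run at n = 1 cm^-3 with phi_model; predicted intensities come out too
# large by `scale`, so divide them by `scale` afterwards.
```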
Version 94 has explicitly fixed this problem by making the relevant parts of the code double precision. Version 94 has been verified at densities of 10^-8 cm^-3.
Versions 94 and before of Cloudy were designed to treat the situation where ionizing radiation is attenuated and the resulting level of ionization decreases with depth into the cloud. The highest and lowest stages of ionization that are considered are adjusted downwards as the ionization decreases, both to save time and avoid numerical instabilities in the matrix inversion routine. The lowest fractional abundance that is considered is set with the set trim command, described in Part I of Hazy.
This approach will fail if the ionization increases with depth by large amounts since the upper limit to the range of ionization is never increased. This can only happen when the gas density falls off much faster than r^-2, so that the ionization parameter (ratio of photons to particles) increases. In this case the ionization will (incorrectly) bunch up at the highest stage of ionization available.
This problem can be overcome by using the set trim command to disable ionization stage trimming. A value like set trim -30 will effectively turn off this feature, at the expense of much longer execution times.
Version 96 no longer has this constraint.
Acknowledgements: Thanks to Jim Kingdon for pointing out this problem.
A low-density cloud with solar (or higher) abundances seemed to crash or stop unexpectedly. What happened?
The first thing to do in any calculation is understand why the code stopped. In most cases it stops because the electron temperature falls below 4000 K, unless this is reset with the stop temperature command. For classical nebulae the gas grows this cool only in neutral regions beyond the H+ - H0 ionization front, so the effect is to stop the calculation near this ionization front.
If the gas has solar or higher abundances, and the density is 1,000 cm^-3 or lower, the ionized part of the cloud can equilibrate at temperatures well below 1000 K despite the high ionization and photoelectric heating. This is discussed, among other places, in Ferland et al. (1984; ApJ 281, 194; the test case dqher.in illustrates the phenomenon) and for HII regions by Shields and Kennicutt (1995, ApJ 454, 807).
This is not a problem, and should occur in real nebulae. (It is not understood why pure recombination line HII regions are not observed.) Cloudy stopped because the gas was too cool. To compute the full ionization structure you should lower the lowest electron temperature with the stop temperature command, say to something like stop temperature 100. But be aware that thermal instabilities can occur around 1000K where the cooling curve has a maximum due to fine structure cooling.