- Source: Gamma-ray burst emission mechanisms
Gamma-ray burst emission mechanisms are theories that explain how the energy from a gamma-ray burst progenitor (regardless of the actual nature of the progenitor) is turned into radiation. These mechanisms are a major topic of research as of 2007. Neither the light curves nor the early-time spectra of GRBs show resemblance to the radiation emitted by any familiar physical process.
Compactness problem
It has been known for many years that ejection of matter at relativistic velocities (velocities very close to the speed of light) is a requirement for producing the emission in a gamma-ray burst. GRBs vary on such short timescales (as short as milliseconds) that the size of the emitting region must be very small, or else the time delay due to the finite speed of light would "smear" the emission out in time, wiping out any short-timescale behavior. At the energies involved in a typical GRB, so much energy crammed into such a small space would make the system opaque to photon-photon pair production, making the burst far less luminous and giving it a very different spectrum from what is observed. However, if the emitting system is moving towards Earth at relativistic velocities, the burst is compressed in time (as seen by an Earth observer, due to the relativistic Doppler effect) and the emitting region inferred from the finite speed of light becomes much smaller than the true size of the GRB (see relativistic beaming).
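In schematic form (a sketch of the standard argument; the exact numerical factors depend on the assumed spectrum and geometry), a variability timescale $\delta t$ naively limits the source size to

$$R \lesssim c\,\delta t \approx 3\times 10^{7}\,\mathrm{cm}\,\left(\frac{\delta t}{1\,\mathrm{ms}}\right),$$

whereas for ejecta moving toward the observer with bulk Lorentz factor $\Gamma$ the inferred limit relaxes to roughly $R \lesssim \Gamma^{2} c\,\delta t$, and the photons are softer by a factor of order $\Gamma$ in the comoving frame; both effects sharply reduce the opacity to pair production.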
GRBs and internal shocks
A related constraint is imposed by the relative timescales seen in some bursts between the short-timescale variability and the total length of the GRB. Often this variability timescale is far shorter than the total burst length. For example, in bursts as long as 100 seconds, the majority of the energy can be released in short episodes less than one second long. If the GRB were due to matter moving towards Earth (as the relativistic motion argument requires), it is hard to understand why it would release its energy in such brief interludes. The generally accepted explanation is that these bursts involve the collision of multiple shells traveling at slightly different velocities, the so-called "internal shocks". The collision of two thin shells flash-heats the matter, converting enormous amounts of kinetic energy into the random motion of particles and greatly amplifying the energy release due to all emission mechanisms. Which physical mechanisms are at play in producing the observed photons is still an area of debate, but the most likely candidates appear to be synchrotron radiation and inverse Compton scattering.
As of 2007 there is no theory that has successfully described the spectrum of all gamma-ray bursts (though some theories work for a subset). However, the so-called Band function (named after David Band) has been fairly successful at fitting, empirically, the spectra of most gamma-ray bursts:
$$N(E) = \begin{cases} E^{\alpha}\exp\left(-\dfrac{E}{E_{0}}\right), & \text{if } E \leq (\alpha-\beta)E_{0} \\[1ex] \left[(\alpha-\beta)E_{0}\right]^{\alpha-\beta}\,E^{\beta}\exp(\beta-\alpha), & \text{if } E > (\alpha-\beta)E_{0} \end{cases}$$
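As a concrete illustration, here is a minimal Python sketch of this empirical form; the default parameter values (alpha, beta, E0) are placeholders of the kind often quoted for GRB spectra, not fits to any particular burst.

```python
import numpy as np

def band_function(E, alpha=-1.0, beta=-2.3, E0=300.0, A=1.0):
    """Empirical Band function N(E) (photon number spectrum, arbitrary norm).

    E and E0 share the same (arbitrary) energy unit, e.g. keV; alpha and beta
    are the low- and high-energy power-law indices.
    """
    E = np.asarray(E, dtype=float)
    E_break = (alpha - beta) * E0          # energy where the two branches join
    low = A * E**alpha * np.exp(-E / E0)   # branch for E <= (alpha - beta) * E0
    high = A * E_break**(alpha - beta) * np.exp(beta - alpha) * E**beta
    return np.where(E <= E_break, low, high)

# Evaluate over 1 keV to 10 MeV with the placeholder parameters
energies = np.logspace(0, 4, 200)          # keV
spectrum = band_function(energies)
```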
A few gamma-ray bursts have shown evidence for an additional, delayed emission component at very high energies (GeV and higher). One theory for this emission invokes inverse Compton scattering. If a GRB progenitor, such as a Wolf-Rayet star, were to explode within a stellar cluster, the resulting shock wave could generate gamma-rays by scattering photons from neighboring stars. About 30% of known galactic Wolf-Rayet stars are located in dense clusters of O stars with intense ultraviolet radiation fields, and the collapsar model suggests that WR stars are likely GRB progenitors. Therefore, a substantial fraction of GRBs are expected to occur in such clusters. As the relativistic matter ejected from the explosion slows and interacts with ultraviolet-wavelength photons, some photons gain energy, generating gamma-rays.
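As a rough order-of-magnitude sketch of the energy boost involved (assuming Thomson-regime inverse Compton scattering; the numbers are purely illustrative):

$$E_{\gamma} \sim \gamma_{e}^{2}\,E_{\mathrm{seed}},$$

so, for example, an electron with Lorentz factor $\gamma_{e} \sim 10^{4}$ scattering a $\sim 10\,\mathrm{eV}$ ultraviolet photon would yield $E_{\gamma} \sim 10^{8} \times 10\,\mathrm{eV} \sim 1\,\mathrm{GeV}$.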
Afterglows and external shocks
The GRB itself is very rapid, lasting from less than a second up to a few minutes at most. Once it disappears, it leaves behind a counterpart at longer wavelengths (X-ray, UV, optical, infrared, and radio) known as the afterglow that generally remains detectable for days or longer.
In contrast to the GRB emission, the afterglow emission is not believed to be dominated by internal shocks. In general, all the ejected matter has by this time coalesced into a single shell traveling outward into the interstellar medium (or possibly the stellar wind) around the star. At the front of this shell of matter is a shock wave referred to as the "external shock" as the still relativistically moving matter ploughs into the tenuous interstellar gas or the gas surrounding the star.
As the interstellar matter moves across the shock, it is immediately heated to extreme temperatures. (How this happens is still poorly understood as of 2007, since the particle density across the shock wave is too low to create a shock wave comparable to those familiar in dense terrestrial environments – the topic of "collisionless shocks" is still largely hypothetical but seems to accurately describe a number of astrophysical situations. Magnetic fields are probably critically involved.) These particles, now moving relativistically, encounter a strong local magnetic field and are accelerated perpendicular to the magnetic field, causing them to radiate their energy via synchrotron radiation.
Synchrotron radiation is well understood, and the afterglow spectrum has been modeled fairly successfully using this template. The emission is generally dominated by electrons (which move and therefore radiate much faster than protons and other particles), so radiation from other particles is generally ignored.
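For orientation, the characteristic synchrotron frequency of an electron with Lorentz factor $\gamma$ in a magnetic field $B$ is, to order of magnitude (a standard expression, written here in Gaussian units),

$$\nu_{\mathrm{syn}} \sim \gamma^{2}\,\frac{eB}{2\pi m_{e}c},$$

which is why the break frequencies discussed below map onto characteristic electron energies.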
In general, the afterglow spectrum takes the form of a power law with three break points (and therefore four different power-law segments). The lowest break point, $\nu_{a}$, corresponds to the frequency below which the emitting region is opaque to its own radiation, so the spectrum attains the form of the Rayleigh-Jeans tail of blackbody radiation. The two other break points, $\nu_{m}$ and $\nu_{c}$, are related respectively to the minimum energy acquired by an electron after it crosses the shock wave and to the time it takes an electron to radiate most of its energy. Depending on which of these two frequencies is higher, two different regimes are possible:
Fast cooling ($\nu_{m} > \nu_{c}$) – Shortly after the GRB, the shock wave imparts immense energy to the electrons and the minimum electron Lorentz factor is very high. In this case, the spectrum looks like:
$$F_{\nu} \propto \begin{cases} \nu^{2}, & \nu < \nu_{a} \\ \nu^{1/3}, & \nu_{a} < \nu < \nu_{c} \\ \nu^{-1/2}, & \nu_{c} < \nu < \nu_{m} \\ \nu^{-p/2}, & \nu_{m} < \nu \end{cases}$$
Slow cooling ($\nu_{m} < \nu_{c}$) – Later after the GRB, the shock wave has slowed down and the minimum electron Lorentz factor is much lower. In this case, the spectrum looks like:
$$F_{\nu} \propto \begin{cases} \nu^{2}, & \nu < \nu_{a} \\ \nu^{1/3}, & \nu_{a} < \nu < \nu_{m} \\ \nu^{-(p-1)/2}, & \nu_{m} < \nu < \nu_{c} \\ \nu^{-p/2}, & \nu_{c} < \nu \end{cases}$$
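The piecewise scalings above, for either cooling regime, can be encoded directly. The Python sketch below assumes arbitrary placeholder break frequencies and electron index p, and rescales each segment only so that the spectrum is continuous at the breaks (the overall normalization is arbitrary).

```python
import numpy as np

def afterglow_spectrum(nu, nu_a, nu_m, nu_c, p=2.5):
    """Piecewise synchrotron afterglow spectrum F_nu (arbitrary normalization).

    Fast cooling (nu_m > nu_c): slopes 2, 1/3, -1/2, -p/2.
    Slow cooling (nu_m < nu_c): slopes 2, 1/3, -(p-1)/2, -p/2.
    Each segment is rescaled so the spectrum is continuous at the breaks.
    """
    nu = np.asarray(nu, dtype=float)
    if nu_m > nu_c:   # fast cooling
        breaks = [nu_a, nu_c, nu_m]
        slopes = [2.0, 1.0 / 3.0, -0.5, -p / 2.0]
    else:             # slow cooling
        breaks = [nu_a, nu_m, nu_c]
        slopes = [2.0, 1.0 / 3.0, -(p - 1.0) / 2.0, -p / 2.0]

    F = np.empty_like(nu)
    norm, lo = 1.0, 0.0
    for i, slope in enumerate(slopes):
        hi = breaks[i] if i < len(breaks) else np.inf
        segment = (nu >= lo) & (nu < hi)
        F[segment] = norm * nu[segment] ** slope
        if i < len(breaks):
            # match the next segment to this one at the break frequency
            norm *= breaks[i] ** (slope - slopes[i + 1])
        lo = hi
    return F

# Example: a slow-cooling spectrum with placeholder break frequencies (Hz)
nu = np.logspace(8, 20, 500)
F = afterglow_spectrum(nu, nu_a=1e9, nu_m=1e12, nu_c=1e15)
```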
The afterglow changes with time: it must fade, but its spectrum changes as well. For the simplest case of adiabatic expansion into a uniform-density medium, the critical parameters evolve as:
$$\nu_{c} \propto t^{1/2}$$
$$\nu_{m} \propto t^{-3/2}$$
$$F_{\nu,\mathrm{max}} = \mathrm{const}$$
Here $F_{\nu,\mathrm{max}}$ is the flux at the current peak frequency of the GRB spectrum. (During fast cooling this is at $\nu_{c}$; during slow cooling it is at $\nu_{m}$.) Note that because $\nu_{m}$ drops faster than $\nu_{c}$, the system eventually switches from fast cooling to slow cooling.
Different scalings are derived for radiative evolution and for a non-constant-density environment (such as a stellar wind), but share the general power-law behavior observed in this case.
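As a minimal numerical sketch of the adiabatic, uniform-density case above (the reference time and initial break frequencies below are arbitrary placeholders, not values for any particular burst):

```python
import numpy as np

# Placeholder break frequencies (Hz) at a reference time t0 (days)
t0 = 0.01       # roughly 15 minutes after the burst
nu_c0 = 1e14    # cooling frequency at t0
nu_m0 = 1e17    # injection (minimum-energy) frequency at t0

def nu_c(t):
    """Cooling break: nu_c ∝ t^(1/2) for adiabatic expansion into a uniform medium."""
    return nu_c0 * (t / t0) ** 0.5

def nu_m(t):
    """Injection break: nu_m ∝ t^(-3/2) under the same assumptions."""
    return nu_m0 * (t / t0) ** -1.5

# Fast cooling holds while nu_m > nu_c; setting nu_m(t) = nu_c(t)
# gives (t/t0)^2 = nu_m0 / nu_c0 for the transition time.
t_transition = t0 * (nu_m0 / nu_c0) ** 0.5
print(f"fast-to-slow cooling transition at t ~ {t_transition:.2f} days")
```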
Several other known effects can modify the evolution of the afterglow:
Reverse shocks and the optical flash
There can be "reverse shocks", which propagate back into the shocked matter once it begins to encounter the interstellar medium. The twice-shocked material can produce a bright optical/UV flash, which has been seen in a few GRBs, though it appears not to be a common phenomenon.
Refreshed shocks and late-time flares
There can be "refreshed" shocks if the central engine continues to release fast-moving matter in small amounts even out to late times; these new shocks will catch up with the external shock to produce something like a late-time internal shock. This explanation has been invoked to explain the frequent flares seen in X-rays and at other wavelengths in many bursts, though some theorists are uncomfortable with the apparent demand that the progenitor (which one would think would be destroyed by the GRB) remain active for very long.
Jet effects
Gamma-ray burst emission is believed to be released in jets, not spherical shells. Initially the two scenarios are equivalent: the center of the jet is not "aware" of the jet edge, and due to relativistic beaming we only see a small fraction of the jet. However, as the jet slows down, two things eventually occur (each at about the same time): first, information from the edge of the jet that there is no pressure to the side propagates to its center, and the jet matter can spread laterally; second, relativistic beaming effects subside, and once Earth observers see the entire jet the widening of the relativistic beam is no longer compensated by our seeing a larger emitting region. Once these effects appear the jet fades very rapidly, an effect that is visible as a power-law "break" in the afterglow light curve. This is the so-called "jet break" that has been seen in some events and is often cited as evidence for the consensus view of GRBs as jets. Many GRB afterglows do not display jet breaks, especially in the X-ray, though they are more common in the optical light curves. However, since jet breaks generally occur at very late times (~1 day or more), when the afterglow is quite faint and often undetectable, this is not necessarily surprising.
Dust extinction and hydrogen absorption
There may be dust along the line of sight from the GRB to Earth, both in the host galaxy and in the Milky Way. If so, the light will be attenuated and reddened, and the afterglow spectrum may look very different from the one modeled.
At very high frequencies (far-ultraviolet and X-ray) interstellar hydrogen gas becomes a significant absorber. In particular, a photon with a wavelength of less than 91 nanometers is energetic enough to completely ionize neutral hydrogen and is absorbed with almost 100% probability even through relatively thin gas clouds. (At much shorter wavelengths the probability of absorption begins to drop again, which is why X-ray afterglows are still detectable.) As a result, observed spectra of very high-redshift GRBs often drop to zero at wavelengths shorter than the observed (redshifted) position of this hydrogen ionization threshold (known as the Lyman break), which is fixed in the GRB host's reference frame. Other, less dramatic hydrogen absorption features are also commonly seen in high-z GRBs, such as the Lyman alpha forest.
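For example, the observed-frame position of this break scales with redshift as (a simple worked example; the redshift value is arbitrary)

$$\lambda_{\mathrm{obs}} = (1+z)\,\lambda_{\mathrm{break}} \approx (1+6)\times 91\,\mathrm{nm} \approx 640\,\mathrm{nm},$$

so for a burst at $z \approx 6$ the spectrum is strongly suppressed at observed wavelengths blueward of roughly 640 nm.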