IkedaCarpenterPV analytical derivative
Overview
This page records the analytical derivative of the implemented
IkedaCarpenterPV peak shape.
This derivation was transcribed from handwritten notes produced during development, so there
may be notation errors; please flag any issue you find.
The notation was chosen for ease of comparison with the code in
IkedaCarpenterPV.cpp and for clarity in distinguishing between variables, so some
symbols differ from those used in IkedaCarpenterPV.
The Mantid implementation of IkedaCarpenterPV evaluates
\[f = I Q \left[(1-\eta)G - \frac{2\eta}{\pi}L\right].\]
Natural parameters and helper definitions
It is convenient to derive the Jacobian first with respect to the natural
parameters
\[(I, \alpha, \beta, R, S, H, \eta, \Delta),\]
and then chain these back to the Mantid parameters.
The wavelength-dependent moderator parameters are
\[\alpha = \frac{1}{A_0 + \lambda A_1},\]
\[\beta = \frac{1}{B_0},\]
\[R = \exp\left(-\frac{81.799}{\kappa \lambda^2}\right),\]
\[\Delta = x_i - X_0,\]
where \(A_0 = \mathrm{Alpha0}\), \(A_1 = \mathrm{Alpha1}\),
\(B_0 = \mathrm{Beta0}\), \(\kappa = \mathrm{Kappa}\),
\(x_i\) is the current x-value, and \(\lambda\) is the wavelength at
that x-value.
As in the implementation, define
\[k = 0.05,\]
\[a_u = \alpha(1-k), \qquad a_v = \alpha(1+k), \qquad a_s = \alpha, \qquad a_r = \beta,\]
\[x = a_u - \beta, \qquad y = \alpha - \beta, \qquad z = a_v - \beta.\]
The overall prefactor is
\[Q = \frac{1}{4} \alpha \frac{1-k^2}{k^2}.\]
The branch coefficients are
\[N_u = 1 - R\frac{a_u}{x},\]
\[N_v = 1 - R\frac{a_v}{z},\]
\[N_s = -2\left(1 - R\frac{\alpha}{y}\right),\]
\[N_r = 2R\alpha^2\beta\frac{k^2}{xyz}.\]
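As a numerical sketch (not part of the C++ implementation), these helper quantities and branch coefficients can be evaluated directly; the parameter values below are illustrative only. A useful consistency check is that the four branch coefficients sum to zero, which follows algebraically from the definitions above.

```python
import math

# Illustrative parameter values; the real values come from the fit.
A0, A1, B0, kappa = 1.6, 1.5, 31.9, 46.0
lam = 1.8   # wavelength at the current x-value
k = 0.05

alpha = 1.0 / (A0 + lam * A1)
beta = 1.0 / B0
R = math.exp(-81.799 / (kappa * lam ** 2))

a_u, a_v = alpha * (1 - k), alpha * (1 + k)
x, y, z = a_u - beta, alpha - beta, a_v - beta

Q = 0.25 * alpha * (1 - k ** 2) / k ** 2

N_u = 1 - R * a_u / x
N_v = 1 - R * a_v / z
N_s = -2 * (1 - R * alpha / y)
N_r = 2 * R * alpha ** 2 * beta * k ** 2 / (x * y * z)
```

The exact cancellation \(N_u + N_v + N_s + N_r = 0\) makes a convenient unit-test invariant for the branch coefficients.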
For \(j \in \{u,v,s,r\}\) define
\[q_j = \frac{1}{2} a_j(a_j S - 2\Delta),\]
\[y_j = \frac{a_j S - \Delta}{\sqrt{2S}},\]
\[z_j = -a_j\Delta + i\frac{1}{2}a_j H.\]
Now define the Gaussian-like and Lorentzian-like building blocks
\[K_j = e^{q_j} \operatorname{erfc}(y_j),\]
\[F_j = e^{z_j}E_1(z_j), \qquad J_j = \Im(F_j),\]
where \(E_1\) is the exponential integral and \(\Im\) denotes the imaginary part,
and the two sums
\[G = N_u K_u + N_v K_v + N_s K_s + N_r K_r,\]
\[L = N_u J_u + N_v J_v + N_s J_s + N_r J_r.\]
The implemented function is therefore
\[f = I Q \left[(1-\eta)G - \frac{2\eta}{\pi}L\right].\]
Basic derivatives of the full function
Let
\[M = (1-\eta)G - \frac{2\eta}{\pi}L,\]
so that \(f = IQM\).
The derivative with respect to intensity is immediate:
\[\frac{\partial f}{\partial I} = QM.\]
For \(p = \eta\), the prefactor \(Q\) does not depend on
\(\eta\), so
\[\frac{\partial f}{\partial \eta}
= IQ\frac{\partial M}{\partial \eta}
= -IQ\left(G + \frac{2}{\pi}L\right).\]
For any other natural parameter \(p\),
\[\frac{\partial f}{\partial p}
= I\left[\frac{\partial Q}{\partial p} M + Q\frac{\partial M}{\partial p}\right],\]
with
\[\frac{\partial M}{\partial p}
= (1-\eta)\frac{\partial G}{\partial p}
- \frac{2\eta}{\pi}\frac{\partial L}{\partial p}.\]
Since
\[Q = \frac{1}{4} \alpha \frac{1-k^2}{k^2},\]
it follows that
\[\begin{split}\frac{\partial Q}{\partial p}
= \begin{cases}
\dfrac{Q}{\alpha}, & p = \alpha, \\
0, & p \neq \alpha.
\end{cases}\end{split}\]
Derivative of \(G\)
The Gaussian-like sum is
\[G = \sum_{j \in \{u,v,s,r\}} N_j K_j,\]
so by the product rule
\[\frac{\partial G}{\partial p}
= \sum_{j \in \{u,v,s,r\}}
\left[ \frac{\partial N_j}{\partial p}K_j + N_j\frac{\partial K_j}{\partial p} \right].\]
The derivative of \(K_j\)
Start from
\[K_j = e^{q_j}\operatorname{erfc}(y_j).\]
Differentiate directly:
\[\frac{\partial K_j}{\partial p}
= \frac{\partial e^{q_j}}{\partial p}\operatorname{erfc}(y_j)
+ e^{q_j}\frac{\partial\operatorname{erfc}(y_j)}{\partial p}.\]
Using
\[\frac{\partial e^{q_j}}{\partial p} = \frac{\partial q_j}{\partial p}e^{q_j},\]
and
\[\frac{d}{dy}\operatorname{erfc}(y) = -\frac{2}{\sqrt{\pi}}e^{-y^2},\]
we obtain
\[\frac{\partial K_j}{\partial p}
= \frac{\partial q_j}{\partial p}K_j
- \frac{2}{\sqrt{\pi}} e^{q_j-y_j^2} \frac{\partial y_j}{\partial p}.\]
Now simplify \(q_j - y_j^2\):
\[q_j = \frac{a_j^2S - 2a_j\Delta}{2},\]
\[y_j^2 = \frac{(a_jS - \Delta)^2}{2S}
= \frac{a_j^2S^2 - 2a_j\Delta S + \Delta^2}{2S}
= \frac{a_j^2S - 2a_j\Delta + \Delta^2/S}{2},\]
so
\[q_j - y_j^2 = -\frac{\Delta^2}{2S}.\]
Therefore the exponential prefactor is independent of \(j\):
\[e^{q_j-y_j^2} = e^{-\Delta^2/(2S)}.\]
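This identity is easy to spot-check numerically; the values below are illustrative.

```python
import math

# Spot-check the identity q_j - y_j^2 = -Delta^2 / (2S).
a_j, S, Delta = 0.27, 40.0, 3.5   # illustrative values

q_j = 0.5 * a_j * (a_j * S - 2 * Delta)
y_j = (a_j * S - Delta) / math.sqrt(2 * S)

lhs = q_j - y_j ** 2
rhs = -Delta ** 2 / (2 * S)
```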
Define the common factor
\[C_0 = \frac{2}{\sqrt{\pi}} e^{-\Delta^2/(2S)}.\]
Then
\[\frac{\partial K_j}{\partial p}
= \frac{\partial q_j}{\partial p}K_j - C_0\frac{\partial y_j}{\partial p}.\]
Useful special cases are derived below.
Derivative of \(K_j\) with respect to \(a_j\)
From
\[q_j = \frac{1}{2}a_j^2S - \Delta a_j,\]
we have
\[\frac{\partial q_j}{\partial a_j} = a_j S - \Delta.\]
Also,
\[y_j = \frac{a_jS - \Delta}{\sqrt{2S}},\]
so
\[\frac{\partial y_j}{\partial a_j} = \frac{S}{\sqrt{2S}} = \sqrt{\frac{S}{2}}.\]
Hence
\[\frac{\partial K_j}{\partial a_j}
= (a_jS - \Delta)K_j - C_0\sqrt{\frac{S}{2}}.\]
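This expression can be spot-checked against a central finite difference; the values below are illustrative and the sketch is independent of the Mantid code.

```python
import math

# Finite-difference check of dK_j/da_j = (a_j*S - Delta)*K_j - C0*sqrt(S/2).
def K(a, S, Delta):
    q = 0.5 * a * (a * S - 2 * Delta)
    y = (a * S - Delta) / math.sqrt(2 * S)
    return math.exp(q) * math.erfc(y)

a_j, S, Delta = 0.27, 40.0, 3.5   # illustrative values
C0 = 2 / math.sqrt(math.pi) * math.exp(-Delta ** 2 / (2 * S))

analytic = (a_j * S - Delta) * K(a_j, S, Delta) - C0 * math.sqrt(S / 2)
h = 1e-6
numeric = (K(a_j + h, S, Delta) - K(a_j - h, S, Delta)) / (2 * h)
```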
Derivative of \(K_j\) with respect to \(S\)
Since
\[q_j = \frac{1}{2}a_j(a_jS - 2\Delta),\]
we have
\[\frac{\partial q_j}{\partial S} = \frac{1}{2}a_j^2.\]
For \(y_j\), write
\[y_j = \frac{a_jS - \Delta}{\sqrt{2S}}.\]
Differentiating gives
\[\frac{\partial y_j}{\partial S}
= \frac{a_j}{\sqrt{2S}} + (a_jS - \Delta)\frac{\partial}{\partial S}(2S)^{-1/2}
= \frac{a_j}{\sqrt{2S}} - \frac{a_jS - \Delta}{(2S)^{3/2}}.\]
This simplifies to
\[\frac{\partial y_j}{\partial S}
= \frac{a_jS + \Delta}{2S\sqrt{2S}}.\]
Therefore
\[\frac{\partial K_j}{\partial S}
= \frac{1}{2}a_j^2 K_j - C_0\left[\frac{a_jS + \Delta}{2S\sqrt{2S}}\right].\]
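A corresponding finite-difference check for the \(S\)-derivative, again with illustrative values:

```python
import math

# Finite-difference check of dK_j/dS = (a_j^2/2) K_j - C0 (a_j S + Delta)/(2 S sqrt(2S)).
def K(a, S, Delta):
    q = 0.5 * a * (a * S - 2 * Delta)
    y = (a * S - Delta) / math.sqrt(2 * S)
    return math.exp(q) * math.erfc(y)

a_j, S, Delta = 0.27, 40.0, 3.5   # illustrative values
C0 = 2 / math.sqrt(math.pi) * math.exp(-Delta ** 2 / (2 * S))

analytic = (0.5 * a_j ** 2 * K(a_j, S, Delta)
            - C0 * (a_j * S + Delta) / (2 * S * math.sqrt(2 * S)))
h = 1e-6
numeric = (K(a_j, S + h, Delta) - K(a_j, S - h, Delta)) / (2 * h)
```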
Derivative of \(K_j\) with respect to \(\Delta\)
From the same definitions,
\[\frac{\partial q_j}{\partial \Delta} = -a_j,\]
\[\frac{\partial y_j}{\partial \Delta} = -\frac{1}{\sqrt{2S}}.\]
Hence
\[\frac{\partial K_j}{\partial \Delta}
= -a_jK_j + \frac{C_0}{\sqrt{2S}}.\]
Derivative of \(L\)
The Lorentzian-like sum is
\[L = \sum_{j \in \{u,v,s,r\}} N_j J_j,\]
so similarly
\[\frac{\partial L}{\partial p}
= \sum_{j \in \{u,v,s,r\}}
\left[ \frac{\partial N_j}{\partial p}J_j + N_j\frac{\partial J_j}{\partial p} \right].\]
The derivative of \(J_j\)
Recall that
\[J_j = \Im(F_j), \qquad F_j = e^{z_j}E_1(z_j).\]
Use the identity
\[\frac{d}{dz}\left(e^z E_1(z)\right) = e^zE_1(z) - \frac{1}{z},\]
so
\[\frac{\partial F_j}{\partial p}
= \left(F_j - \frac{1}{z_j}\right)\frac{\partial z_j}{\partial p}.\]
Taking imaginary parts gives
\[\frac{\partial J_j}{\partial p}
= \Im\left[\left(F_j - \frac{1}{z_j}\right)\frac{\partial z_j}{\partial p}\right].\]
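The underlying identity can be checked numerically. The sketch below evaluates \(E_1\) for a complex argument from the integral representation \(E_1(z) = \int_0^1 e^{-z/u}/u\,du\) (valid for \(\Re z > 0\)); this quadrature is used only for the check and is not how the implementation computes \(E_1\).

```python
import cmath

def E1(z, n=4000):
    # Composite Simpson quadrature of E1(z) = \int_0^1 exp(-z/u)/u du,
    # valid for Re(z) > 0; the integrand vanishes as u -> 0.
    h = 1.0 / n
    total = 0j
    for i in range(1, n + 1):
        u = i * h
        w = 4 if i % 2 == 1 else 2
        if i == n:
            w = 1
        total += w * cmath.exp(-z / u) / u
    return total * h / 3

def F(z):
    return cmath.exp(z) * E1(z)

z = 1.2 + 0.8j   # illustrative point with Re(z) > 0
h = 1e-5
numeric = (F(z + h) - F(z - h)) / (2 * h)   # derivative along the real axis
analytic = F(z) - 1 / z                     # the identity used above
```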
Useful special cases are obtained from
\[z_j = -a_j\Delta + i\frac{1}{2}a_jH.\]
Derivative of \(J_j\) with respect to \(a_j\)
Since
\[\frac{\partial z_j}{\partial a_j} = -\Delta + i\frac{H}{2},\]
we have
\[\frac{\partial J_j}{\partial a_j}
= \Im\left[\left(F_j - \frac{1}{z_j}\right)\left(-\Delta + i\frac{H}{2}\right)\right].\]
Derivative of \(J_j\) with respect to \(H\)
Now
\[\frac{\partial z_j}{\partial H} = i\frac{a_j}{2},\]
so
\[\frac{\partial J_j}{\partial H}
= \Im\left[\left(F_j - \frac{1}{z_j}\right)i\frac{a_j}{2}\right]
= \frac{a_j}{2}\Re\left(F_j - \frac{1}{z_j}\right).\]
Here \(\Re\) denotes the real part.
Derivative of \(J_j\) with respect to \(\Delta\)
Finally,
\[\frac{\partial z_j}{\partial \Delta} = -a_j,\]
hence
\[\frac{\partial J_j}{\partial \Delta}
= -a_j\Im\left(F_j - \frac{1}{z_j}\right).\]
Derivatives of the branch coefficients
Derivatives with respect to \(R\)
These follow immediately:
\[\frac{\partial N_u}{\partial R} = -\frac{a_u}{x},\]
\[\frac{\partial N_v}{\partial R} = -\frac{a_v}{z},\]
\[\frac{\partial N_s}{\partial R} = 2\frac{\alpha}{y},\]
\[\frac{\partial N_r}{\partial R} = 2\alpha^2\beta\frac{k^2}{xyz}.\]
Derivatives with respect to \(\alpha\)
For \(N_u\), write
\[N_u = 1 - R\frac{a_u}{x}, \qquad a_u = \alpha(1-k), \qquad x = a_u - \beta.\]
Then
\[\frac{\partial N_u}{\partial \alpha}
= -R\frac{\partial}{\partial \alpha}\left(\frac{a_u}{x}\right)
= -R\frac{(\partial a_u/\partial \alpha)x - a_u(\partial x/\partial \alpha)}{x^2}.\]
Since
\[\frac{\partial a_u}{\partial \alpha} = 1-k, \qquad \frac{\partial x}{\partial \alpha} = 1-k,\]
we obtain
\[\frac{\partial N_u}{\partial \alpha}
= -R(1-k)\frac{x-a_u}{x^2}
= R(1-k)\frac{\beta}{x^2}.\]
Similarly,
\[\frac{\partial N_v}{\partial \alpha} = R(1+k)\frac{\beta}{z^2}.\]
For \(N_s = -2(1 - R\alpha/y)\),
\[\frac{\partial N_s}{\partial \alpha}
= 2R\frac{\partial}{\partial \alpha}\left(\frac{\alpha}{y}\right)
= 2R\frac{y-\alpha}{y^2}
= -2R\frac{\beta}{y^2}.\]
For \(N_r = 2R\alpha^2\beta k^2/(xyz)\), use the logarithmic derivative:
\[\frac{1}{N_r}\frac{\partial N_r}{\partial \alpha}
= \frac{2}{\alpha} - \frac{1}{x}\frac{\partial x}{\partial \alpha}
- \frac{1}{y}\frac{\partial y}{\partial \alpha}
- \frac{1}{z}\frac{\partial z}{\partial \alpha}.\]
Since
\[\frac{\partial x}{\partial \alpha} = 1-k, \qquad
\frac{\partial y}{\partial \alpha} = 1, \qquad
\frac{\partial z}{\partial \alpha} = 1+k,\]
this becomes
\[\frac{\partial N_r}{\partial \alpha}
= N_r\left[\frac{2}{\alpha} - \frac{1-k}{x} - \frac{1}{y} - \frac{1+k}{z}\right].\]
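A finite-difference check of this logarithmic-derivative form, with illustrative values:

```python
import math

# Finite-difference check of dN_r/dalpha = N_r [2/alpha - (1-k)/x - 1/y - (1+k)/z].
k = 0.05

def Nr(alpha, beta, R):
    x = alpha * (1 - k) - beta
    y = alpha - beta
    z = alpha * (1 + k) - beta
    return 2 * R * alpha ** 2 * beta * k ** 2 / (x * y * z)

alpha, beta, R = 0.3, 0.05, 0.5   # illustrative values
x, y, z = alpha * (1 - k) - beta, alpha - beta, alpha * (1 + k) - beta

analytic = Nr(alpha, beta, R) * (2 / alpha - (1 - k) / x - 1 / y - (1 + k) / z)
h = 1e-7
numeric = (Nr(alpha + h, beta, R) - Nr(alpha - h, beta, R)) / (2 * h)
```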
Derivatives with respect to \(\beta\)
For \(N_u = 1 - Ra_u/x\) and \(\partial x/\partial \beta = -1\),
\[\frac{\partial N_u}{\partial \beta}
= -\frac{Ra_u}{x^2}.\]
Similarly,
\[\frac{\partial N_v}{\partial \beta} = -\frac{Ra_v}{z^2},\]
\[\frac{\partial N_s}{\partial \beta} = 2R\frac{\alpha}{y^2}.\]
For \(N_r\), another logarithmic derivative gives
\[\frac{1}{N_r}\frac{\partial N_r}{\partial \beta}
= \frac{1}{\beta} - \frac{1}{x}\frac{\partial x}{\partial \beta}
- \frac{1}{y}\frac{\partial y}{\partial \beta}
- \frac{1}{z}\frac{\partial z}{\partial \beta}.\]
Because
\[\frac{\partial x}{\partial \beta} = \frac{\partial y}{\partial \beta} = \frac{\partial z}{\partial \beta} = -1,\]
we obtain
\[\frac{\partial N_r}{\partial \beta}
= N_r\left[\frac{1}{\beta} + \frac{1}{x} + \frac{1}{y} + \frac{1}{z}\right].\]
Using \(\partial a_j / \partial p\) to lift branch derivatives to \(\alpha\) and \(\beta\)
The derivatives above for \(K_j\) and \(J_j\) were first written with
respect to the branch variable \(a_j\). These are converted to derivatives
with respect to \(\alpha\) and \(\beta\) by the chain rule.
Only the following branch derivatives are non-zero:
\[\frac{\partial a_u}{\partial \alpha} = 1-k, \qquad
\frac{\partial a_v}{\partial \alpha} = 1+k, \qquad
\frac{\partial a_s}{\partial \alpha} = 1, \qquad
\frac{\partial a_r}{\partial \alpha} = 0,\]
and
\[\frac{\partial a_u}{\partial \beta} = 0, \qquad
\frac{\partial a_v}{\partial \beta} = 0, \qquad
\frac{\partial a_s}{\partial \beta} = 0, \qquad
\frac{\partial a_r}{\partial \beta} = 1.\]
Therefore
\[\frac{\partial K_u}{\partial \alpha} = \frac{\partial K_u}{\partial a_u}(1-k),
\qquad
\frac{\partial K_v}{\partial \alpha} = \frac{\partial K_v}{\partial a_v}(1+k),
\qquad
\frac{\partial K_s}{\partial \alpha} = \frac{\partial K_s}{\partial a_s},
\qquad
\frac{\partial K_r}{\partial \alpha} = 0,\]
and similarly
\[\frac{\partial K_r}{\partial \beta} = \frac{\partial K_r}{\partial a_r},\]
with the other \(\partial K_j/\partial \beta\) terms equal to zero.
Exactly the same pattern applies to \(J_j\).
Collecting the natural-parameter derivatives
With the pieces above, the natural-parameter derivatives are
\[\frac{\partial G}{\partial p}
= \sum_{j \in \{u,v,s,r\}}
\left[ \frac{\partial N_j}{\partial p}K_j + N_j\frac{\partial K_j}{\partial p} \right],\]
\[\frac{\partial L}{\partial p}
= \sum_{j \in \{u,v,s,r\}}
\left[ \frac{\partial N_j}{\partial p}J_j + N_j\frac{\partial J_j}{\partial p} \right],\]
\[\frac{\partial M}{\partial p}
= (1-\eta)\frac{\partial G}{\partial p}
- \frac{2\eta}{\pi}\frac{\partial L}{\partial p},\]
\[\frac{\partial f}{\partial p}
= I\left[\frac{\partial Q}{\partial p}M + Q\frac{\partial M}{\partial p}\right],\]
for \(p \neq \eta\), together with
\[\frac{\partial f}{\partial \eta}
= -IQ\left(G + \frac{2}{\pi}L\right).\]
Chain rule back to Mantid parameters
The derivatives with respect to the natural parameters now need to be mapped
back to the parameters exposed by Mantid.
Alpha0 and Alpha1
Since
\[\alpha = \frac{1}{A_0 + \lambda A_1},\]
we have
\[\frac{\partial \alpha}{\partial A_0} = -\alpha^2,\]
\[\frac{\partial \alpha}{\partial A_1} = -\lambda\alpha^2.\]
Therefore
\[\frac{\partial f}{\partial A_0} = -\alpha^2\frac{\partial f}{\partial \alpha},\]
\[\frac{\partial f}{\partial A_1} = -\lambda\alpha^2\frac{\partial f}{\partial \alpha}.\]
Beta0
Because
\[\beta = \frac{1}{B_0},\]
it follows that
\[\frac{\partial \beta}{\partial B_0} = -\beta^2,\]
so
\[\frac{\partial f}{\partial B_0} = -\beta^2\frac{\partial f}{\partial \beta}.\]
Kappa
With
\[R = \exp\left(-\frac{81.799}{\kappa\lambda^2}\right),\]
we obtain
\[\frac{\partial R}{\partial \kappa}
= \frac{81.799}{\kappa^2\lambda^2}R.\]
Hence
\[\frac{\partial f}{\partial \kappa}
= \frac{81.799}{\kappa^2\lambda^2}R\frac{\partial f}{\partial R}.\]
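All four moderator chain rules can be verified with central finite differences; the parameter values are illustrative:

```python
import math

# Finite-difference checks of the moderator chain-rule factors.
A0, A1, B0, kappa, lam = 1.6, 1.5, 31.9, 46.0, 1.8   # illustrative values
h = 1e-7

alpha = 1 / (A0 + lam * A1)
beta = 1 / B0
R = math.exp(-81.799 / (kappa * lam ** 2))

# Analytic chain-rule factors from the text.
dalpha_dA0 = -alpha ** 2
dalpha_dA1 = -lam * alpha ** 2
dbeta_dB0 = -beta ** 2
dR_dkappa = 81.799 / (kappa ** 2 * lam ** 2) * R

# Central finite differences of the same maps.
fd_A0 = (1 / (A0 + h + lam * A1) - 1 / (A0 - h + lam * A1)) / (2 * h)
fd_A1 = (1 / (A0 + lam * (A1 + h)) - 1 / (A0 + lam * (A1 - h))) / (2 * h)
fd_B0 = (1 / (B0 + h) - 1 / (B0 - h)) / (2 * h)
fd_kappa = (math.exp(-81.799 / ((kappa + h) * lam ** 2))
            - math.exp(-81.799 / ((kappa - h) * lam ** 2))) / (2 * h)
```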
X0
Since
\[\Delta = x_i - X_0,\]
we have
\[\frac{\partial \Delta}{\partial X_0} = -1,\]
and therefore
\[\frac{\partial f}{\partial X_0} = -\frac{\partial f}{\partial \Delta}.\]
SigmaSquared and Gamma
The final two Mantid parameters are \(V = \mathrm{SigmaSquared}\) and
\(\Gamma_v = \mathrm{Gamma}\). The code first converts these Voigt
parameters into pseudo-Voigt parameters \(H\) and \(\eta\).
Define
\[g = \sqrt{8\ln 2\, V},\]
\[P = g^5 + 2.69269 g^4\Gamma_v + 2.42843 g^3\Gamma_v^2
+ 4.47163 g^2\Gamma_v^3 + 0.07842 g\Gamma_v^4 + \Gamma_v^5,\]
\[H = P^{1/5},\]
\[t = \frac{\Gamma_v}{H},\]
\[\eta = 1.36603 t - 0.47719 t^2 + 0.11116 t^3,\]
\[S = \frac{H^2}{8\ln 2}.\]
The derivative of \(H\) with respect to \(P\) is
\[\frac{\partial H}{\partial P}
= \frac{1}{5}P^{-4/5}
= \frac{1}{5H^4}.\]
Therefore
\[\frac{\partial H}{\partial V}
= \frac{1}{5H^4}\frac{\partial P}{\partial V}
= \frac{1}{5H^4}\frac{\partial P}{\partial g}\frac{\partial g}{\partial V},\]
and since
\[\frac{\partial g}{\partial V}
= \frac{4\ln 2}{g},\]
we have
\[\frac{\partial H}{\partial V}
= \frac{1}{5H^4}\frac{\partial P}{\partial g}\frac{4\ln 2}{g}.\]
Likewise,
\[\frac{\partial H}{\partial \Gamma_v}
= \frac{1}{5H^4}\frac{\partial P}{\partial \Gamma_v}.\]
The explicit polynomial derivatives are
\[\frac{\partial P}{\partial g}
= 5g^4 + 4(2.69269)g^3\Gamma_v + 3(2.42843)g^2\Gamma_v^2
+ 2(4.47163)g\Gamma_v^3 + 0.07842\Gamma_v^4,\]
\[\frac{\partial P}{\partial \Gamma_v}
= 2.69269 g^4 + 2(2.42843)g^3\Gamma_v + 3(4.47163)g^2\Gamma_v^2
+ 4(0.07842)g\Gamma_v^3 + 5\Gamma_v^4.\]
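The polynomial derivatives can be checked against finite differences of \(P\) (illustrative \(g\) and \(\Gamma_v\)):

```python
# Finite-difference check of dP/dg and dP/dGamma for the pseudo-Voigt polynomial.
def P(g, Gv):
    return (g ** 5 + 2.69269 * g ** 4 * Gv + 2.42843 * g ** 3 * Gv ** 2
            + 4.47163 * g ** 2 * Gv ** 3 + 0.07842 * g * Gv ** 4 + Gv ** 5)

g, Gv = 2.0, 1.5   # illustrative values

dP_dg = (5 * g ** 4 + 4 * 2.69269 * g ** 3 * Gv + 3 * 2.42843 * g ** 2 * Gv ** 2
         + 2 * 4.47163 * g * Gv ** 3 + 0.07842 * Gv ** 4)
dP_dGv = (2.69269 * g ** 4 + 2 * 2.42843 * g ** 3 * Gv
          + 3 * 4.47163 * g ** 2 * Gv ** 2
          + 4 * 0.07842 * g * Gv ** 3 + 5 * Gv ** 4)

h = 1e-6
fd_g = (P(g + h, Gv) - P(g - h, Gv)) / (2 * h)
fd_Gv = (P(g, Gv + h) - P(g, Gv - h)) / (2 * h)
```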
Next,
\[\frac{\partial S}{\partial H} = \frac{H}{4\ln 2},\]
so
\[\frac{\partial S}{\partial V}
= \frac{H}{4\ln 2}\frac{\partial H}{\partial V},\]
\[\frac{\partial S}{\partial \Gamma_v}
= \frac{H}{4\ln 2}\frac{\partial H}{\partial \Gamma_v}.\]
For \(\eta\), use
\[\frac{\partial \eta}{\partial t}
= 1.36603 - 0.95438 t + 0.33348 t^2.\]
Since \(t = \Gamma_v/H\),
\[\frac{\partial t}{\partial V}
= -\frac{\Gamma_v}{H^2}\frac{\partial H}{\partial V},\]
\[\frac{\partial t}{\partial \Gamma_v}
= \frac{1}{H} - \frac{\Gamma_v}{H^2}\frac{\partial H}{\partial \Gamma_v}.\]
Hence
\[\frac{\partial \eta}{\partial V}
= \frac{\partial \eta}{\partial t}\frac{\partial t}{\partial V},\]
\[\frac{\partial \eta}{\partial \Gamma_v}
= \frac{\partial \eta}{\partial t}\frac{\partial t}{\partial \Gamma_v}.\]
Finally, the Mantid-parameter derivatives are
\[\frac{\partial f}{\partial V}
= \frac{\partial f}{\partial S}\frac{\partial S}{\partial V}
+ \frac{\partial f}{\partial H}\frac{\partial H}{\partial V}
+ \frac{\partial f}{\partial \eta}\frac{\partial \eta}{\partial V},\]
\[\frac{\partial f}{\partial \Gamma_v}
= \frac{\partial f}{\partial S}\frac{\partial S}{\partial \Gamma_v}
+ \frac{\partial f}{\partial H}\frac{\partial H}{\partial \Gamma_v}
+ \frac{\partial f}{\partial \eta}\frac{\partial \eta}{\partial \Gamma_v}.\]
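The whole \(V, \Gamma_v \to H, \eta, S\) conversion chain can be validated the same way; the sketch below differentiates with respect to \(V\) and compares against the chain-rule expressions above (illustrative values):

```python
import math

# Finite-difference check of the Voigt -> pseudo-Voigt chain rules.
# V (SigmaSquared) and Gv (Gamma) values are illustrative only.
L2 = math.log(2)

def convert(V, Gv):
    g = math.sqrt(8 * L2 * V)
    P = (g ** 5 + 2.69269 * g ** 4 * Gv + 2.42843 * g ** 3 * Gv ** 2
         + 4.47163 * g ** 2 * Gv ** 3 + 0.07842 * g * Gv ** 4 + Gv ** 5)
    H = P ** 0.2
    t = Gv / H
    eta = 1.36603 * t - 0.47719 * t ** 2 + 0.11116 * t ** 3
    S = H ** 2 / (8 * L2)
    return H, eta, S

V, Gv = 0.6, 1.1
g = math.sqrt(8 * L2 * V)
H, eta, S = convert(V, Gv)

dP_dg = (5 * g ** 4 + 4 * 2.69269 * g ** 3 * Gv + 3 * 2.42843 * g ** 2 * Gv ** 2
         + 2 * 4.47163 * g * Gv ** 3 + 0.07842 * Gv ** 4)
dH_dV = dP_dg * (4 * L2 / g) / (5 * H ** 4)
dS_dV = H / (4 * L2) * dH_dV
t = Gv / H
deta_dV = (1.36603 - 0.95438 * t + 0.33348 * t ** 2) * (-Gv / H ** 2) * dH_dV

h = 1e-7
Hp, ep, Sp = convert(V + h, Gv)
Hm, em, Sm = convert(V - h, Gv)
```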
Summary
The analytical derivative is most naturally implemented in two stages:
first derive \(\partial f / \partial p\) for the natural parameters
\((I, \alpha, \beta, R, S, H, \eta, \Delta)\),
then use the chain rule to map these to the Mantid parameters
\((I, \mathrm{Alpha0}, \mathrm{Alpha1}, \mathrm{Beta0}, \mathrm{Kappa},
\mathrm{SigmaSquared}, \mathrm{Gamma}, X_0)\).