On the Computation of Complex-valued Gradients with Application to Statistically Optimum Beamforming

Christoph Boeddeker, Patrick Hanebrink, Lukas Drude, Jahn Heymann, Reinhold Haeb-Umbach
2017
9 references

Abstract

This report describes the computation of gradients by algorithmic differentiation for statistically optimum beamforming operations. In particular, the differentiation of complex-valued functions is a key component of this approach, so real-valued algorithmic differentiation is extended via the complex-valued chain rule. Beyond the basic mathematical operations, the derivative of the eigenvalue problem with complex-valued eigenvectors is one of the key results of this report. The potential of this approach is demonstrated with experimental results on the CHiME-3 challenge database, where the beamforming task serves as the front-end of an ASR system. With the developed derivatives, a joint optimization of a speech enhancement and speech recognition system w.r.t. the recognition optimization criterion becomes possible.
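The extension of real-valued algorithmic differentiation to complex-valued functions rests on the Wirtinger calculus: a (possibly non-holomorphic) function is differentiated with respect to z and its conjugate z* via df/dz = (df/dx - i df/dy)/2 and df/dz* = (df/dx + i df/dy)/2. The sketch below (not from the report; a minimal illustration using finite differences in pure Python) checks these two derivatives for the non-holomorphic function f(z) = |z|^2, for which df/dz = conj(z) and df/dz* = z:

```python
def wirtinger_derivatives(f, z, h=1e-6):
    """Finite-difference Wirtinger derivatives of a real-valued f at z.

    Uses the definitions
        df/dz  = 0.5 * (df/dx - 1j * df/dy)
        df/dz* = 0.5 * (df/dx + 1j * df/dy)
    where x = Re(z) and y = Im(z). Illustrative sketch only.
    """
    # Central differences along the real and imaginary axes.
    dfdx = (f(z + h) - f(z - h)) / (2 * h)
    dfdy = (f(z + 1j * h) - f(z - 1j * h)) / (2 * h)
    return 0.5 * (dfdx - 1j * dfdy), 0.5 * (dfdx + 1j * dfdy)

# f(z) = |z|^2 is real-valued and non-holomorphic, so the ordinary
# complex derivative does not exist, but both Wirtinger derivatives do.
f = lambda z: (z * z.conjugate()).real

z = 1.0 + 2.0j
dz, dzc = wirtinger_derivatives(f, z)
# Analytically: df/dz = conj(z) = 1 - 2j, df/dz* = z = 1 + 2j
```

This is the same structure exploited in the gradient checks referenced below: a real-valued cost at the end of the pipeline makes df/dz* = conj(df/dz), so one complex gradient suffices for backpropagation.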


Code References

pytorch/pytorch (3 files)

docs/source/notes/gradcheck.rst (1 match)
Consider the elementary case where :math:`N = M = 1` first. We know from (chapter 3 of) `this research paper <https://arxiv.org/pdf/1701.00392.pdf>`_ that:

torch/autograd/gradcheck.py (1 match)
# Section 3.5.3 https://arxiv.org/pdf/1701.00392.pdf

torch/csrc/autograd/FunctionsManual.cpp (2 matches)
// https://arxiv.org/pdf/1701.00392.pdf Eq 4.77
// see also https://arxiv.org/pdf/1701.00392.pdf Eqs. (4.60) and (4.63)

tensorflow/tensorflow (1 file)

tensorflow/python/ops/linalg_grad.py (1 match)
https://arxiv.org/abs/1701.00392