We present a generalization of the alternating directions implicit (ADI)
iteration to higher dimensional problems. We solve equations of the form
\[
\left( I \otimes \cdots \otimes I \otimes A_{1} + I \otimes \cdots \otimes I \otimes A_{2} \otimes I + \cdots + A_{d} \otimes I \otimes \cdots \otimes I \right) \operatorname{vec}(X) = \operatorname{vec}(B),
\]
with $B$ given in the tensor train format. The solution $X$ is computed in the tensor train format as well.
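As an illustration (assuming $\operatorname{vec}$ denotes column-wise vectorization), the case $d = 2$ reduces to the Sylvester equation
\[
A_{1} X + X A_{2}^{T} = B,
\]
the classical setting of the ADI iteration for matrix equations.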
The accuracy of the computed $X$ depends exponentially on the local rank of $X$ and on the rank of $B$. To prove this, we adapt a result for right-hand sides of low Kronecker rank to right-hand sides of low tensor train rank. Further, we give a convergence proof for the generalized ADI iteration in the single-shift case and present first ideas for more sophisticated shift strategies. The conditioning of tensor-structured equations is investigated by generalizing results from the matrix equation case. Finally, we present first numerical results.