Multi-Variable Iteration

Types

Functions

LUSE_ENGR701_704_NumericalMethods.MultiVariableIterations.find_omega - Function
find_omega(MVI[, omega=0.])

Find the optimum relaxation parameter, omega, for MVI.A and MVI.x, if possible.

Notes

If MVI.A is positive definite and 0 < omega < 2, then the method will converge regardless of the choice for MVI.x. If MVI.A is also tridiagonal, then the spectral radius of Gauss-Seidel's T-matrix, $\mathbf{T}_g$, is used to calculate $\omega := 2 / \bigl(1 + \sqrt{1 - \rho(\mathbf{T}_g)}\bigr)$, where $\rho(\cdot)$ denotes the spectral radius [burdenNumericalAnalysis2016].
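
A minimal sketch of this calculation, assuming the splitting $\mathbf{A} = \mathbf{D} - \mathbf{L} - \mathbf{U}$ used by the methods below; spectral_radius and optimal_omega are hypothetical stand-ins for illustration, not this package's API:

using LinearAlgebra

# Spectral radius ρ(T): the largest eigenvalue magnitude of T.
spectral_radius(T) = maximum(abs, eigvals(T))

# Hypothetical helper mirroring the formula above.
function optimal_omega(A)
    D = Diagonal(A)      # diagonal part of A
    L = -tril(A, -1)     # strictly lower part, signed so that A = D - L - U
    U = -triu(A, 1)      # strictly upper part, likewise
    Tg = (D - L) \ U     # Gauss-Seidel T-matrix
    return 2 / (1 + sqrt(1 - spectral_radius(Tg)))
end

A = [4.0 3.0 0.0; 3.0 4.0 -1.0; 0.0 -1.0 4.0]    # tridiagonal, positive definite
optimal_omega(A)    # ≈ 1.24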

LUSE_ENGR701_704_NumericalMethods.MultiVariableIterations.gauss_seidel - Method
gauss_seidel(MVI)

Solve $\vec{x} = \mathbf{A}^{-1}\vec{b}$ via the Gauss-Seidel Method.

Notes

This method improves on jacobi() by using the most recently calculated entries of the approximation vector x within each iteration. The core algorithm by which the method marches through iterations:

\[ \vec{x}^{(k)} = \bigl( (\mathbf{D} - \mathbf{L})^{-1} \mathbf{U} \bigr) \vec{x}^{(k - 1)} + (\mathbf{D} - \mathbf{L})^{-1} \vec{b} \]

where $\mathbf{D}$, $-\mathbf{L}$, and $-\mathbf{U}$ are the diagonal, strictly lower-triangular, and strictly upper-triangular parts of $\mathbf{A}$, respectively (i.e. $\mathbf{A} = \mathbf{D} - \mathbf{L} - \mathbf{U}$).
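
A toy implementation of this update rule, for illustration only (gauss_seidel_sketch is a hypothetical name, not the package's gauss_seidel):

using LinearAlgebra

function gauss_seidel_sketch(A, b, x0; k_max=25)
    D, L, U = Diagonal(A), -tril(A, -1), -triu(A, 1)    # A = D - L - U
    T = (D - L) \ U    # iteration matrix
    c = (D - L) \ b    # constant term
    x = copy(x0)
    for _ in 1:k_max
        x = T*x + c    # x⁽ᵏ⁾ = T⋅x⁽ᵏ⁻¹⁾ + c
    end
    return x
end

A = [4.0 3.0 0.0; 3.0 4.0 -1.0; 0.0 -1.0 4.0]
b = [24.0, 30.0, -24.0]
gauss_seidel_sketch(A, b, zeros(3))    # ≈ [3.0, 4.0, -5.0]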

LUSE_ENGR701_704_NumericalMethods.MultiVariableIterations.jacobi - Method
jacobi(MVI)

Solve $\vec{x} = \mathbf{A}^{-1}\vec{b}$ via the Jacobi Method.

Notes

The core algorithm by which the method marches through iterations:

\[ \vec{x}^{(k)} = \bigl( \mathbf{D}^{-1} (\mathbf{L} + \mathbf{U}) \bigr) \vec{x}^{(k - 1)} + \mathbf{D}^{-1} \vec{b} \]
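
The same structure as the Gauss-Seidel sketch above, but with the Jacobi splitting and a stopping test on successive iterates (jacobi_sketch is again hypothetical, not the package's jacobi):

using LinearAlgebra

function jacobi_sketch(A, b, x0; tol=1e-9, k_max=100)
    D, L, U = Diagonal(A), -tril(A, -1), -triu(A, 1)    # A = D - L - U
    T = D \ (L + U)    # iteration matrix
    c = D \ b          # constant term
    x = copy(x0)
    for _ in 1:k_max
        x_new = T*x + c
        norm(x_new - x, Inf) < tol && return x_new    # successive iterates agree
        x = x_new
    end
    return x
end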

LUSE_ENGR701_704_NumericalMethods.MultiVariableIterations.successive_relaxation - Function
successive_relaxation(MVI[, omega=0.])

Solve $\vec{x} = \mathbf{A}^{-1}\vec{b}$ via the Successive Relaxation Method. The method is Successive Over-Relaxation (SOR) if omega > 1, Successive Under-Relaxation (SUR) if omega < 1, and reduces to Gauss-Seidel if omega = 1.

Notes

SOR and SUR accelerate or decelerate the convergence of gauss_seidel(), respectively, by decreasing or increasing the spectral radius of the iteration matrix. The core algorithm by which the method marches through iterations:

\[ \vec{x}^{(k)} = \bigl( (\mathbf{D} - \omega\mathbf{L})^{-1} ((1 - \omega)\mathbf{D} + \omega\mathbf{U}) \bigr) \vec{x}^{(k - 1)} + \omega (\mathbf{D} - \omega\mathbf{L})^{-1} \vec{b} \]

If left unspecified, and if possible, an optimum relaxation parameter, ω, will be calculated by find_omega().
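
A sketch of the scheme under the same assumptions (successive_relaxation_sketch is hypothetical, not the package's function); setting ω = 1 reproduces the Gauss-Seidel sketch above:

using LinearAlgebra

function successive_relaxation_sketch(A, b, x0, ω; k_max=25)
    D, L, U = Diagonal(A), -tril(A, -1), -triu(A, 1)    # A = D - L - U
    T = (D - ω*L) \ ((1 - ω)*D + ω*U)    # iteration matrix
    c = ω * ((D - ω*L) \ b)              # constant term
    x = copy(x0)
    for _ in 1:k_max
        x = T*x + c
    end
    return x
end

A = [4.0 3.0 0.0; 3.0 4.0 -1.0; 0.0 -1.0 4.0]
b = [24.0, 30.0, -24.0]
successive_relaxation_sketch(A, b, zeros(3), 1.24)    # ω ≈ the optimum from find_omega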

LUSE_ENGR701_704_NumericalMethods.newton_raphson - Method
newton_raphson(MVI::MultiVariableIteration, variables::Tuple{Vararg{Num}}[; jacobian=nothing])

Solve the non-linear system of equations $\mathbf{A}(\vec{x}) = \vec{b}$ via the Newton-Raphson Method.

Here, MVI.A should be a vector of functions wherein each variable is represented. The method runs faster if jacobian is pre-defined; otherwise, the Jacobian matrix of MVI.A will be constructed internally.

Examples

using Symbolics

# Non-linear test system from [burdenNumericalAnalysis2016].
f1(x1, x2, x3) = 3x1 - cos(x2*x3) - 0.5
f2(x1, x2, x3) = x1^2 - 81(x2 + 0.1)^2 + sin(x3) + 1.06
f3(x1, x2, x3) = exp(-x1*x2) + 20x3 + (10π - 3)/3

A   = [f1, f2, f3]        # system of equations
b   = zeros(length(A))    # right-hand side: find the roots of A
x0  = [0.1, 0.1, -0.1]    # initial guess
tol = 1e-9
MVI = MultiVariableIteration(A, x0, b, 5, tol)

@variables x1, x2, x3     # symbolic variables for the Jacobian
newton_raphson(MVI, (x1, x2, x3))
LUSE_ENGR701_704_NumericalMethods.solve - Method
solve(MVI::MultiVariableIteration[; method=:jacobi, omega=0., variables, jacobian])

Solve $\vec{x} = \mathbf{A}^{-1}\vec{b}$ according to method ∈ {:jacobi (default), :gauss_seidel, :successive_relaxation, :newton_raphson}.

Each method has an equivalent convenience function; e.g. solve(MVI; method=:jacobi) is equivalent to jacobi(MVI).
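
A hypothetical usage, assuming the MultiVariableIteration constructor order shown in the newton_raphson example above (A, x0, b, N, tol):

A   = [4.0 3.0 0.0; 3.0 4.0 -1.0; 0.0 -1.0 4.0]
b   = [24.0, 30.0, -24.0]
MVI = MultiVariableIteration(A, zeros(3), b, 25, 1e-9)
solve(MVI)                                   # method=:jacobi by default
solve(MVI; method=:gauss_seidel)             # same as gauss_seidel(MVI)
solve(MVI; method=:successive_relaxation)    # omega=0. lets find_omega() choose ω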

