# Matrix functions

A **matrix** is a square array, that is, an array with two dimensions of the same length. It can also have other dimensions, not necessarily of the same length. Matrix functions perform a variety of operations on matrices and other arrays.

Standard mathematical notation for matrix algebra, and most computer languages that work with matrices, distinguish rows and columns. In Analytica, rows and columns are not basic to an array: they are just ways you can choose to display an array in a table. Instead, Analytica uses a named index to label each dimension. So, when using a matrix function, as with any Analytica function that works with arrays, you specify the dimension(s) you want it to operate over by naming the index(es) as parameter(s). For example:

`Transpose(X, I, J)`

This exchanges indexes «I» and «J» for matrix «X». You don’t need to remember or worry about which dimension represents the rows and which the columns. «X» can also have other indexes, and the function generalizes appropriately.

## Dot product of two matrices

The dot product (or inner product) of «MatrixA» and «MatrixB», when both are indexed by «i», is given by:

`Sum(MatrixA * MatrixB, i)`

Unlike conventional linear algebra notation, and computer languages such as APL that follow it, you don't have to transpose the first parameter. Analytica "knows" that both parameters are indexed by «i», and so does the right thing. You don't have to worry about which is the "row vector" and which the "column vector".

It also works fine if the parameters have any other indexes. The result will have the union of the indexes, but without «i». See also Sum().

**Example:**

`Variable MatrixA :=`

| j ▼ \ i ▶ | 1 | 2 | 3 |
|---|---|---|---|
| a | 4 | 1 | 2 |
| b | 2 | 5 | 3 |
| c | 3 | 2 | 7 |

`Variable MatrixB :=`

| k ▼ \ i ▶ | 1 | 2 | 3 |
|---|---|---|---|
| l | 3 | 2 | 1 |
| m | 2 | 5 | 3 |
| n | 4 | 1 | 2 |

`Sum(MatrixA*MatrixB, i) →`

| k ▼ \ j ▶ | a | b | c |
|---|---|---|---|
| l | 16 | 19 | 20 |
| m | 19 | 38 | 37 |
| n | 21 | 19 | 28 |
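For readers who want to check the result outside Analytica, here is a minimal Python sketch (not Analytica code) of the same computation, using the table values above:

```python
# Dot product over the shared index i, keeping the remaining indexes j and k.
# Rows of matrix_a run over j; rows of matrix_b run over k; positions run over i.
matrix_a = {'a': [4, 1, 2], 'b': [2, 5, 3], 'c': [3, 2, 7]}
matrix_b = {'l': [3, 2, 1], 'm': [2, 5, 3], 'n': [4, 1, 2]}

# Sum(MatrixA * MatrixB, i): multiply element-wise over i, then sum i out.
dot = {k: {j: sum(a * b for a, b in zip(matrix_a[j], matrix_b[k]))
           for j in matrix_a}
       for k in matrix_b}
# e.g. dot['m']['b'] is 38, matching the table above
```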

## Cross product

The cross product (also known as the outer product or tensor product) of two vectors (or more generally, arrays) with different indexes is simply their product in Analytica:

`MatrixA * MatrixB`

The result has the union of the indexes of the two operands. Each value or cell in the resulting array is simply the product of the elements of the two operands at the corresponding index values.
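As an illustration, a small Python sketch (not Analytica code; the vectors here use made-up values):

```python
# Cross (outer) product of two vectors with different indexes j and i.
u = {'a': 2, 'b': 3}           # indexed by j (hypothetical values)
v = {1: 10, 2: 20, 3: 30}      # indexed by i (hypothetical values)

# The result carries both indexes; each cell is the product of the
# corresponding elements of the two operands.
outer = {j: {i: u[j] * v[i] for i in v} for j in u}
```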

## MatrixMultiply(a, aRow, aCol, b, bRow, bCol)

Performs a matrix multiplication on matrix «a», having indexes «aRow» and «aCol», and matrix «b», having indexes «bRow» and «bCol». The result is indexed by «aRow» and «bCol». «a» and «b» must have the specified two indexes, and can also have other indexes. «aCol» and «bRow» must have the same length or it flags an error. If «aRow» and «bCol» are the same index, it returns only the diagonal of the result.

See also MatrixMultiply().

**Library:** Matrix

**Example:** Matrices

`Variable C:=`

| index1 ▼ \ index2 ▶ | 1 | 2 |
|---|---|---|
| 1 | 1 | 2 |
| 2 | 1 | 0 |


`Variable D:=`

| index2 ▼ \ index3 ▶ | a | b | c |
|---|---|---|---|
| 1 | 3 | 0 | 1 |
| 2 | 0 | 1 | 1 |

`MatrixMultiply(C, index1, index2, D, index2, index3) →`

| index1 ▼ \ index3 ▶ | a | b | c |
|---|---|---|---|
| 1 | 3 | 2 | 3 |
| 2 | 3 | 0 | 1 |

When the inner index is shared by `C` and `D`, the expression `Sum(C*D, index2)` is equivalent to their dot product.

To multiply a matrix `A`, indexed by `I` and `J`, by its own transpose, use:

`MatrixMultiply(A, I, J, Transpose(A, I, J), I, J)`

It does not work to use `MatrixMultiply(A, I, J, A, J, I)` because the result would have to be doubly indexed by `I`.
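The example above can be checked positionally in plain Python (a sketch, not Analytica code):

```python
# MatrixMultiply(C, index1, index2, D, index2, index3), computed positionally.
C = [[1, 2],
     [1, 0]]        # rows: index1, columns: index2
D = [[3, 0, 1],
     [0, 1, 1]]     # rows: index2, columns: index3

# Sum over the shared inner index (index2); the result is indexed by
# index1 (rows) and index3 (columns).
product = [[sum(C[r][k] * D[k][c] for k in range(len(D)))
            for c in range(len(D[0]))]
           for r in range(len(C))]
```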

## Transpose(c, i, j)

Returns the transpose of matrix «c» exchanging dimensions «i» and «j», which must be indexes of the same size.

See also Transpose().

**Library:** Matrix

**Example:**

`Transpose(MatrixA, i, j) →`

| j ▼ \ i ▶ | 1 | 2 | 3 |
|---|---|---|---|
| a | 4 | 2 | 3 |
| b | 1 | 5 | 2 |
| c | 2 | 3 | 7 |

The conjugate transpose of a complex matrix is obtained using

`Transpose(Conj(c), i, j)`
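In Python terms (a sketch, not Analytica code), transposing exchanges the two positional dimensions, and the conjugate transpose additionally conjugates each element; the complex matrix below uses made-up values:

```python
# Transpose of the MatrixA example (rows: j, columns: i).
matrix_a = [[4, 1, 2],
            [2, 5, 3],
            [3, 2, 7]]
transposed = [list(row) for row in zip(*matrix_a)]

# Conjugate transpose of a small complex matrix (hypothetical values).
z = [[1 + 2j, -1j],
     [3 + 0j, 2 + 2j]]
conj_transpose = [[v.conjugate() for v in row] for row in zip(*z)]
```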

## Identity matrix

Given two indexes of equal length, the identity matrix is obtained by the expression `(@i = @j)`.

**Example:**

`@i = @j →`

| j ▼ \ i ▶ | 1 | 2 | 3 |
|---|---|---|---|
| a | 1 | 0 | 0 |
| b | 0 | 1 | 0 |
| c | 0 | 0 | 1 |
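The same construction in a Python sketch (not Analytica code): `@i` and `@j` refer to positions along each index, so the identity has a 1 wherever the two positions coincide:

```python
i_vals = [1, 2, 3]          # index i
j_vals = ['a', 'b', 'c']    # index j, same length as i

# (@i = @j): compare the position along i with the position along j.
identity = [[1 if p == q else 0 for p in range(len(i_vals))]
            for q in range(len(j_vals))]
```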

## Unit vector

A unit vector on index «i» is obtained by the expression `Array(i, 1)`. There is no need to differentiate between a row vector and a column vector, since it is the index that determines its orientation.

**Example:**

`Array(i,1) →`

| i ▶ | 1 | 2 | 3 |
|---|---|---|---|
| | 1 | 1 | 1 |

## Invert(c, i, j)

Returns the inverse of matrix «c» along dimensions «i» and «j».

See also Invert().

**Library:** Matrix

**Example:** Set number format to fixed point, 3 decimal digits.

`Invert(MatrixA, i, j) →`

| j ▼ \ i ▶ | 1 | 2 | 3 |
|---|---|---|---|
| a | 0.326 | -0.034 | -0.079 |
| b | -0.056 | 0.247 | -0.090 |
| c | -0.124 | -0.056 | 0.202 |
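As a quick sanity check in Python (a sketch, not Analytica code), multiplying MatrixA by the displayed inverse should give approximately the identity matrix:

```python
A = [[4, 1, 2],
     [2, 5, 3],
     [3, 2, 7]]
A_inv = [[0.326, -0.034, -0.079],
         [-0.056, 0.247, -0.090],
         [-0.124, -0.056, 0.202]]   # inverse as displayed, to 3 decimals

# A times its inverse, computed positionally; should be close to identity.
product = [[sum(A[r][k] * A_inv[k][c] for k in range(3)) for c in range(3)]
           for r in range(3)]
```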

## Determinant(c, i, j)

Returns the determinant of matrix «c» along dimensions «i» and «j».

See also Determinant().

**Library:** Matrix

**Example:**

`Determinant(MatrixA, i, j) → 89`

`MatrixA` is defined above.
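The value 89 can be reproduced by hand with a cofactor expansion along the first row (a Python sketch, not Analytica code):

```python
A = [[4, 1, 2],
     [2, 5, 3],
     [3, 2, 7]]

def det2(a, b, c, d):
    # Determinant of the 2x2 matrix [[a, b], [c, d]].
    return a * d - b * c

# Cofactor expansion along the first row of A.
det = (A[0][0] * det2(A[1][1], A[1][2], A[2][1], A[2][2])
       - A[0][1] * det2(A[1][0], A[1][2], A[2][0], A[2][2])
       + A[0][2] * det2(A[1][0], A[1][1], A[2][0], A[2][1]))
```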

## Decompose(c, i, j)

Returns the Cholesky decomposition (square root) matrix of matrix «c» along dimensions «i» and «j». Matrix «c» must be symmetric and positive-definite. (Positive-definite means that `v*C*v > 0` for all non-zero vectors `v`.)

Cholesky decomposition computes a lower triangular matrix `L` such that `L*L' = C`, where `L'` is the transpose of `L`.

See also Decompose().

**Library:** Matrix

**Example:**

`Variable Matrix :=`

| L ▼ \ M ▶ | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| 1 | 6 | 2 | 6 | 3 | 1 |
| 2 | 2 | 4 | 3 | 1 | 3 |
| 3 | 6 | 3 | 9 | 3 | 4 |
| 4 | 3 | 1 | 3 | 8 | 4 |
| 5 | 1 | 3 | 4 | 4 | 7 |

`Decompose(Matrix, L, M) →`

| L ▼ \ M ▶ | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| 1 | 2.4495 | 0 | 0 | 0 | 0 |
| 2 | 0.8165 | 1.8257 | 0 | 0 | 0 |
| 3 | 2.4495 | 0.5477 | 1.6432 | 0 | 0 |
| 4 | 1.2247 | -0 | -0 | 2.5495 | 0 |
| 5 | 0.4082 | 1.4606 | 1.3389 | 1.3728 | 1.0113 |

**Note:** The computed result includes values that are nearly zero (-1.2e-16 and -2.7e-16), possibly as a result of floating point round-off error. When displayed with Fixed Point number format to four digits, these display as -0.
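The same factorization can be reproduced with the textbook Cholesky algorithm (a Python sketch, not Analytica code):

```python
import math

# The symmetric, positive-definite example matrix from above.
C = [[6, 2, 6, 3, 1],
     [2, 4, 3, 1, 3],
     [6, 3, 9, 3, 4],
     [3, 1, 3, 8, 4],
     [1, 3, 4, 4, 7]]
n = len(C)

# Build the lower-triangular factor L so that L * L' = C.
L = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1):
        s = sum(L[i][k] * L[j][k] for k in range(j))
        if i == j:
            L[i][j] = math.sqrt(C[i][i] - s)    # diagonal entry
        else:
            L[i][j] = (C[i][j] - s) / L[j][j]   # entry below the diagonal
```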

## EigenDecomp(a: Numeric[i, j]; i, j: Index)

Computes the Eigenvalues and Eigenvectors of a square, symmetric matrix «a» indexed by «i» and «j». EigenDecomp() returns a result indexed by «j» and «.item», where «.item» is a temporary index with the two elements `['value', 'vector']`. Each column of the result contains one Eigenvalue/Eigenvector pair. The Eigenvalue is a number; the Eigenvector is a reference to a vector indexed by «i». If `result` is the result of evaluating EigenDecomp(), then the Eigenvalues are given by `result[.item='value']`, and the Eigenvectors are given by `#result[.item='vector']`. Each Eigenvector is indexed by «i».

Given a square matrix «a», a number λ is called an Eigenvalue of «a», and a non-zero vector `x` the corresponding Eigenvector of «a», when:

`a x = λ x`

An *NxN* matrix has *N* (not necessarily unique) Eigenvalue-Eigenvector pairs. When `a` is a symmetric matrix, the Eigenvalues and Eigenvectors are real-valued. Eigen-analysis is widely used in engineering and statistics.

**Library:** Matrix

**Example:**

`Variable Covariance1 :=`

| stock1 ▼ \ stock2 ▶ | INTC | MOT | AMD |
|---|---|---|---|
| INTC | 30.47 | 13.26 | 18.9 |
| MOT | 13.26 | 16.58 | 14.67 |
| AMD | 18.9 | 14.7 | 17.1 |

`EigenDecomp(Covariance1, Stock1, Stock2) →`

| .item ▼ \ stock2 ▶ | INTC | MOT | AMD |
|---|---|---|---|
| value | 53.91 | 9.227 | 1.022 |
| vector | «ref 1» | «ref 2» | «ref 3» |

| stock1 ▼ | «ref 1» |
|---|---|
| INTC | -0.7002 |
| MOT | -0.4625 |
| AMD | -0.5439 |

| stock1 ▼ | «ref 2» |
|---|---|
| INTC | -0.6549 |
| MOT | 0.7194 |
| AMD | 0.2314 |

| stock1 ▼ | «ref 3» |
|---|---|
| INTC | -0.2843 |
| MOT | -0.5182 |
| AMD | 0.8066 |
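The dominant Eigenvalue above can be cross-checked with a simple power iteration (a Python sketch, not Analytica code):

```python
# Covariance matrix from the example above.
cov = [[30.47, 13.26, 18.9],
       [13.26, 16.58, 14.67],
       [18.9, 14.7, 17.1]]

# Power iteration: repeatedly apply the matrix and renormalize.
x = [1.0, 1.0, 1.0]
for _ in range(200):
    y = [sum(cov[r][c] * x[c] for c in range(3)) for r in range(3)]
    norm = max(abs(v) for v in y)
    x = [v / norm for v in y]

# Rayleigh quotient estimate of the dominant Eigenvalue.
ax = [sum(cov[r][c] * x[c] for c in range(3)) for r in range(3)]
eigenvalue = sum(xr * axr for xr, axr in zip(x, ax)) / sum(v * v for v in x)
```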

## SingularValueDecomp(a, i, j, j2)

SingularValueDecomp() (singular value decomposition) is often used with sets of equations or matrices that are singular or ill-conditioned (that is, very close to singular). It factors a matrix «a», indexed by «i» and «j», with `Size(i) >= Size(j)`, into three matrices, «U», «W», and «V», such that:

`a = U . W . V'`

where `U` and `V` are orthogonal matrices and `W` is a diagonal matrix. `U` is dimensioned by «i» and «j», «W» by «j» and «j2», and `V` by «j» and «j2». In Analytica notation:

`Variable A := Sum(Sum(U*W, J) * Transpose(V, J, J2), J2)`

The index «j2» must be the same size as «j» and is used to index the resulting `W` and `V` arrays.

SingularValueDecomp() returns an array of three elements indexed by a special system index named `SvdIndex`, with each element, `U`, `W`, and `V`, being a reference to the corresponding array. Use the `#` (dereference) operator to obtain the matrix value from each reference, as in:

`Index J2 := CopyIndex(J)`

`Variable SvdResult := SingularValueDecomp(MatrixA, I, J, J2)`

`Variable U := #SvdResult[SvdIndex = 'U']`

`Variable W := #SvdResult[SvdIndex = 'W']`

`Variable V := #SvdResult[SvdIndex = 'V']`

`U →`

| j ▼ \ i ▶ | 1 | 2 | 3 |
|---|---|---|---|
| a | -0.4789 | -0.4517 | -0.7527 |
| b | -0.8562 | 0.05127 | 0.514 |
| c | 0.1936 | -0.8907 | 0.4113 |

`W →`

| j ▼ \ i ▶ | 1 | 2 | 3 |
|---|---|---|---|
| a | 10.14 | 0 | 0 |
| b | 0 | 2.606 | 0 |
| c | 0 | 0 | 3.367 |

`W[j = i] →`

| i ▶ | 1 | 2 | 3 |
|---|---|---|---|
| | 10.14 | 2.606 | 3.367 |

`V →`

| j ▼ \ i ▶ | 1 | 2 | 3 |
|---|---|---|---|
| a | -0.3818 | -0.9001 | 0.2097 |
| b | -0.5398 | 0.03299 | -0.8412 |
| c | -0.7502 | 0.4344 | 0.4984 |
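A quick consistency check (a Python sketch, not Analytica code): for a square matrix, the product of the singular values (the diagonal of `W`) equals the absolute value of the determinant, which Determinant() gave as 89 for `MatrixA`:

```python
singular_values = [10.14, 2.606, 3.367]   # diagonal of W, as displayed

product = 1.0
for s in singular_values:
    product *= s
# product is about 88.97, matching |Determinant(MatrixA)| = 89 to display precision
```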

## VectorCrossProduct(u,v,i)

(*New to Analytica 5.0*)

Computes the vector cross product «u» × «v» in 3-dimensional space (or 7-dimensional space). «u» and «v» are both vectors indexed by «i», and «i» must have 3 elements (or 7 elements). The result is a vector that is perpendicular to both «u» and «v», and whose length is equal to the area of the parallelogram spanned by the original vectors. When «u» and «v» are both unit vectors, the length of the resulting vector is the sine of the angle between «u» and «v».

**Note:** The vector cross product is well-defined only in 3-D and 7-D spaces.
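In 3-D the cross product has the familiar component formula; a Python sketch (not Analytica code, with made-up input vectors):

```python
u = [1.0, 2.0, 3.0]
v = [4.0, 5.0, 6.0]

# Component formula for the 3-D cross product u x v.
cross = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]

# The result is perpendicular to both inputs: both dot products are zero.
dot_u = sum(c * a for c, a in zip(cross, u))
dot_v = sum(c * b for c, b in zip(cross, v))
```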
