Linear Algebra
Smile Shell provides a MATLAB-like environment. In the simplest case, you can use it as a calculator.
    smile> "Hello, World"
    res0: String = Hello, World
    smile> 2
    res1: Int = 2
    smile> 2+3
    res2: Int = 5
In Java:
    smile> "Hello, World"
    $9 ==> "Hello, World"
    smile> 2
    $10 ==> 2
    smile> 2+3
    $11 ==> 5
Math Functions
Besides the functions in java.lang.Math, smile.math.MathEx provides many other important mathematical functions such as factorial, choose, etc.
    smile> choose(10, 3)
    res8: Double = 120.0
In Java:
    smile> import static smile.math.MathEx.*
    smile> choose(10, 3)
    $14 ==> 120.0
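For instance, choose can be cross-checked against factorial; a minimal sketch in Java (both functions are assumed to return double, as the output above shows):

    import static smile.math.MathEx.*;

    // choose(n, k) is the binomial coefficient n! / (k! (n-k)!),
    // so it must agree with the direct factorial computation.
    double direct = choose(10, 3);
    double viaFactorial = factorial(10) / (factorial(3) * factorial(7));
    System.out.println(direct);        // 120.0
    System.out.println(viaFactorial);  // 120.0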
Special Functions
Special mathematical functions include beta, erf, gamma, and their related functions. Special functions are particular mathematical functions that have more or less established names and notations due to their importance in mathematical analysis, functional analysis, physics, or other applications. Many special functions appear as solutions of differential equations or integrals of elementary functions. For example, the error function erf (also called the Gauss error function) is a special function of sigmoid shape which occurs in probability, statistics, materials science, and partial differential equations. The complementary error function, denoted erfc, is defined as erfc(x) = 1 - erf(x). The error function and complementary error function are special cases of the incomplete gamma function.
    smile> erf(1.0)
    res0: Double = 0.8427007929497149
    smile> digamma(1.0)
    res11: Double = -0.5772156649015328
In Java:
    smile> import smile.math.special.*
    smile> Erf.erf(1.0)
    $16 ==> 0.8427007929497149
    smile> Gamma.digamma(1.0)
    $17 ==> -0.5772156649015328
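The identity erfc(x) = 1 - erf(x) from the paragraph above is easy to verify; a minimal sketch, assuming Erf.erfc alongside Erf.erf:

    import smile.math.special.Erf;

    // erf(1) + erfc(1) should be 1 up to floating-point rounding.
    double erf1 = Erf.erf(1.0);    // 0.8427007929497149
    double erfc1 = Erf.erfc(1.0);
    System.out.println(erf1 + erfc1);  // ~1.0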
Vector Operations
Common arithmetic operations on vectors and scalars are similar to those in R and MATLAB.
    smile> val x = c(1.0, 2.0, 3.0, 4.0)
    smile> val y = c(4.0, 3.0, 2.0, 1.0)
    smile> x + y
    res22: smile.math.VectorAddVector = Array(5.0, 5.0, 5.0, 5.0)
    smile> 1.5 * x - 3.0 * y
    res24: smile.math.VectorSubVector = Array(-10.5, -6.0, -1.5, 3.0)
In Java:
    smile> double[] x = {1.0, 2.0, 3.0, 4.0}
    x ==> double[4] { 1.0, 2.0, 3.0, 4.0 }
    smile> double[] y = {4.0, 3.0, 2.0, 1.0}
    y ==> double[4] { 4.0, 3.0, 2.0, 1.0 }
    // vector expression is not supported in Java
          
Note that these operations are lazy. The computation is only performed when the results are needed, e.g. when the expression is used where a vector is expected. In the shell, the expression is evaluated immediately because the shell always prints out the result.
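In Java, which has no operator overloading, the equivalent of 1.5 * x - 3.0 * y is a plain, eager loop:

    double[] x = {1.0, 2.0, 3.0, 4.0};
    double[] y = {4.0, 3.0, 2.0, 1.0};
    double[] z = new double[x.length];
    for (int i = 0; i < z.length; i++) {
        z[i] = 1.5 * x[i] - 3.0 * y[i];  // element-wise
    }
    // z = {-10.5, -6.0, -1.5, 3.0}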
For a vector, there are multiple functions to compute its norm, such as norm (L2 norm), norm1 (L1 norm), norm2 (L2 norm), normInf (infinity norm), and normFro (Frobenius norm). We can also standardize a vector to mean 0 and variance 1, unitize it so that its L2 norm is 1, or unitize1 it so that its L1 norm is 1.
    smile> norm(x)
    res13: Double = 5.477225575051661
    smile> unitize(y)
    smile> y
    res14: Array[Double] = Array(0.7302967433402214, 0.5477225575051661, 0.3651483716701107, 0.18257418583505536)
In Java:
    smile> norm(x)
    $20 ==> 5.477225575051661
    smile> unitize(y)
    smile> y
    y ==> double[4] { 0.7302967433402214, 0.5477225575051661, 0.3651483716701107, 0.18257418583505536 }
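The standardize function mentioned above works in the same in-place style as unitize; a minimal sketch (mean and sd are assumed to be the usual MathEx helpers):

    import static smile.math.MathEx.*;

    double[] v = {1.0, 2.0, 3.0, 4.0};
    standardize(v);               // rescales v in place to mean 0, variance 1
    System.out.println(mean(v));  // ~0.0
    System.out.println(sd(v));    // ~1.0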
For a pair of vectors, we can calculate the dot product, distance, divergence, covariance, and correlation with dot, distance, kld (Kullback-Leibler divergence), jsd (Jensen-Shannon divergence), cov, cor (Pearson correlation), spearman (Spearman rank correlation coefficient), and kendall (Kendall tau rank correlation coefficient).
    smile> dot(x, y)
    res16: Double = 3.651483716701107
    smile> cov(x, y)
    res17: Double = -0.30429030972509225
In Java:
    smile> dot(x, y)
    $22 ==> 3.651483716701107
    smile> cov(x, y)
    $23 ==> -0.30429030972509225
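The distance and correlation functions work the same way; for example, x and y above are perfectly anti-correlated:

    import static smile.math.MathEx.*;

    double[] x = {1.0, 2.0, 3.0, 4.0};
    double[] y = {4.0, 3.0, 2.0, 1.0};
    System.out.println(distance(x, y));  // Euclidean distance, sqrt(20) ~ 4.472
    System.out.println(cor(x, y));       // Pearson correlation, -1.0 here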
Matrix Operations
Like MATLAB, we can use eye, zeros and ones to create identity, zero, or all-ones matrices, respectively. To create a matrix from a two-dimensional array, we can use the constructor matrix or the ~ operator. The ~ operator can be applied to a one-dimensional array too, which creates a single-column matrix.
    val a = matrix(
        c(0.7220180, 0.07121225, 0.6881997),
        c(-0.2648886, -0.89044952, 0.3700456),
        c(-0.6391588, 0.44947578, 0.6240573)
    )
    val b = matrix(
        c(0.6881997, -0.07121225, 0.7220180),
        c(0.3700456, 0.89044952, -0.2648886),
        c(0.6240573, -0.44947578, -0.6391588)
    )
    val C = Array(
        Array(0.9527204, -0.2973347, 0.06257778),
        Array(-0.2808735, -0.9403636, -0.19190231),
        Array(0.1159052, 0.1652528, -0.97941688)
    )
    val c = ~C // or val c = matrix(C)
In Java:
    import smile.math.matrix.*;
    double[][] A = {
        {0.7220180, 0.07121225, 0.6881997},
        {-0.2648886, -0.89044952, 0.3700456},
        {-0.6391588, 0.44947578, 0.6240573}
    };
    double[][] B = {
        {0.6881997, -0.07121225, 0.7220180},
        {0.3700456, 0.89044952, -0.2648886},
        {0.6240573, -0.44947578, -0.6391588}
    };
    double[][] C = {
        {0.9527204, -0.2973347, 0.06257778},
        {-0.2808735, -0.9403636, -0.19190231},
        {0.1159052, 0.1652528, -0.97941688}
    };
    var a = Matrix.of(A);
    var b = Matrix.of(B);
    var c = Matrix.of(C);
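In the Java API, an identity or zero matrix can be created directly; a minimal sketch, assuming Matrix.eye(n) and the zero-initializing Matrix(m, n) constructor:

    import smile.math.matrix.Matrix;

    var I = Matrix.eye(3);     // 3 x 3 identity matrix
    var Z = new Matrix(3, 3);  // 3 x 3 zero matrix (entries default to 0)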
In Scala, matrix-vector operations read just like the math formulas.
    smile> val x = c(1.0, 2.0, 3.0)
    x: Array[Double] = Array(1.0, 2.0, 3.0)
    smile> val y = c(3.0, 2.0, 1.0)
    y: Array[Double] = Array(3.0, 2.0, 1.0)
    smile> val res: Array[Double] = a * x + 1.5 * y
    res: Array[Double] = Array(7.4290416, 2.06434916, 3.63196466)
In Java:
    smile> double[] x = {1.0, 2.0, 3.0}
    x ==> double[3] { 1.0, 2.0, 3.0 }
    smile> double[] y = {3.0, 2.0, 1.0}
    y ==> double[3] { 3.0, 2.0, 1.0 }
    smile> a.mv(1.0, x, 1.5, y)
    $48 ==> double[3] { 7.4290416, 2.06434916, 3.63196466 }
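Here mv follows the usual BLAS gemv convention: a.mv(alpha, x, beta, y) computes alpha * a * x + beta * y, which matches a * x + 1.5 * y above.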
Similarly, for matrix-matrix operations:
    smile> a + b // returns a lazy expression of a + b
    res27: smile.math.MatrixAddMatrix =
        1.4102    0.0000    1.4102
        0.1052    0.0000    0.1052
       -0.0151    0.0000   -0.0151
In Java:
    smile> a.add(b) // a = a + b
    $44 ==> 3 x 3
      1.4102    0.0000    1.4102
      0.1052    0.0000    0.1052
     -0.0151    0.0000   -0.0151
Note that a * b is element-wise:
    smile> a * b // returns a lazy expression of a * b
    res28: smile.math.MatrixMulMatrix =
        0.4969   -0.0051    0.4969
       -0.0980   -0.7929   -0.0980
       -0.3989   -0.2020   -0.3989
In Java:
    smile> a.mul(b) // a *= b
    $45 ==> 3 x 3
      0.4969   -0.0051    0.4969
     -0.0980   -0.7929   -0.0980
     -0.3989   -0.2020   -0.3989
For matrix multiplication, the operator is %*%, the same as in R:
    smile> a %*% b %*% c
    [main] INFO smile.math.MatrixOrderOptimization - The minimum cost of matrix multiplication chain: 54
    res29: smile.math.MatrixExpression =
        0.9984    0.0067    0.0554
       -0.0257    0.9361    0.3508
       -0.0495   -0.3517    0.9348
In Java:
    smile> a.mm(b).mm(c)
    $49 ==> 3 x 3
      0.9984    0.0067    0.0554
     -0.0257    0.9361    0.3508
     -0.0495   -0.3517    0.9348
The method Matrix.transpose returns the transpose of a matrix and executes immediately. However, the method t on MatrixExpression is preferred, as it is lazy.
    smile> a %*% b.t %*% c
    [main] INFO smile.math.MatrixOrderOptimization - The minimum cost of matrix multiplication chain: 54
    res30: smile.math.MatrixExpression =
        0.8978   -0.4369    0.0543
        0.4189    0.8856    0.2006
       -0.1357   -0.1574    0.9782
In Java:
    smile> a.mt(b).mm(c)
    $50 ==> 3 x 3
      0.8978   -0.4369    0.0543
      0.4189    0.8856    0.2006
     -0.1357   -0.1574    0.9782
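In the Java API, mt multiplies by the transpose of its second operand, so a.mt(b) corresponds to a %*% b.t in Scala.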
Smile performs runtime optimization of matrix multiplication chains, which can greatly improve performance. Note that this optimization is only available in the Scala API. Below, we generate several random matrices and multiply them together.
    val a = randn( 300,  900)
    val b = randn( 900,  150)
    val c = randn( 150, 1800)
    val d = randn(1800,   30)
    time("matrix multiplication") {(a %*% b %*% c %*% d).toMatrix}
In Java:
    smile> var a = Matrix.randn( 300,  900)
    [main] INFO smile.math.MathEx - Set RNG seed 19650218 for thread main
    a ==> 300 x 900
     -0.9299   -0.4984    1.3793    1.8589  ... 4842   -0.5907  ...
      ...
    smile> var b = Matrix.randn( 900,  150)
    b ==> 900 x 150
      0.9851    0.9842    0.7543   -0.6598  ... 9706    0.9420  ...
      ...
    smile> var c = Matrix.randn( 150, 1800)
    c ==> 150 x 1800
      0.8682   -1.9094   -0.2466    0.1238 ... 2070   -1.1657  ...
      ...
    smile> var d = Matrix.randn(1800,   30)
    d ==> 1800 x 30
     -0.1421   -0.4016   -1.7960    0.2153  ... 6566   -1.0292  ...
      ...
    smile> a.mm(b).mm(c).mm(d)
    $55 ==> 300 x 30
    1027.7940  -7083.7899  20850.3728  14316.0928  3122.5039  6656.6392  -14332.0066  ...
    -15355.3544  18424.0367  3362.8806  1969.2299  -23705.3085  -8948.9324  7468.9138  ...
    -442.4282  7575.2694  -8070.4564  15107.1986  10726.3271  -170.4820  -19199.5856  ...
    4155.9123  -11273.9462  4326.8992  -276.7401  22746.9657  23260.6079  -1052.8137  ...
    27450.9909  -353.9005  26619.2334  -2807.0904  -18675.1774  -7891.4804  9164.3414  ...
    11257.9267  -12587.2370  -15836.0616  -8085.9522  -1277.4189  -11561.2331  -8508.3348  ...
    -7136.4159  3785.3912  -15033.8276  9799.7746  -16499.4337  16218.9645  13444.4842  ...
      ...
where randn() creates a matrix of normally distributed random numbers. The shell will try to load machine-optimized BLAS/LAPACK native libraries for most matrix computations. If BLAS/LAPACK is not available, Smile falls back to a pure Java implementation.
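The optimization exploits the fact that multiplying an m-by-n matrix by an n-by-p matrix costs on the order of m * n * p scalar multiplications, so the parenthesization matters. A back-of-the-envelope check in Java for the chain above (the exact ordering the optimizer picks is up to the library):

    // Cost of (m x n) times (n x p) is m * n * p scalar multiplications.
    long leftToRight = 300L * 900 * 150    // (a b):     300 x 150
                     + 300L * 150 * 1800   // ((ab) c):  300 x 1800
                     + 300L * 1800 * 30;   // ((abc) d): 300 x 30
    long rightToLeft = 150L * 1800 * 30    // (c d):     150 x 30
                     + 900L * 150 * 30     // (b (cd)):  900 x 30
                     + 300L * 900 * 30;    // (a (bcd)): 300 x 30
    System.out.println(leftToRight);  // 137700000
    System.out.println(rightToLeft);  // 20250000, roughly 7x cheaper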
Matrix Decomposition
In linear algebra, a matrix decomposition or matrix factorization is a factorization of a matrix into a product of matrices. There are various matrix decompositions. In Smile, we provide LU, QR, Cholesky, eigen, and SVD decomposition by the functions lu, qr, cholesky, eigen, and svd, respectively.
With these decompositions, many important linear algebra operations can be performed, such as calculating the matrix rank and determinant, solving linear systems, computing the inverse matrix, etc. In fact, Smile has the functions det, rank, inv and the operator \ for these common computations.
    smile> val x = Array(1.0, 2.0, 3.0)
    x: Array[Double] = Array(1.0, 2.0, 3.0)
    smile> a \ x
    res14: Array[Double] = Array(2.9290414582113184, -0.9356509345036078, 2.131964578605774)
    smile> inv(a)
    res19: smile.math.matrix.Matrix =
      0.7220   -0.2649   -0.6392
      0.0712   -0.8904    0.4495
      0.6882    0.3700    0.6241
    smile> inv(a) %*% a
    res21: smile.math.MatrixExpression =
      1.0000    0.0000    0.0000
     -0.0000    1.0000    0.0000
     -0.0000    0.0000    1.0000
In Java:
    smile> double[] x = {1.0, 2.0, 3.0}
    x ==> double[3] { 1.0, 2.0, 3.0 }
    smile> var inv = a.inverse()
    inv ==> 3 x 3
      0.7220   -0.2649   -0.6392
      0.0712   -0.8904    0.4495
      0.6882    0.3700    0.6241
    smile> inv.mm(a)
    $67 ==> 3 x 3
      1.0000   -0.0000    0.0000
     -0.0000    1.0000    0.0000
      0.0000    0.0000    1.0000
    smile> var lu = a.lu()
    lu ==> smile.math.matrix.Matrix$LU@5f375618
    smile> lu.solve(x)
    $69 ==> double[3] { -1.7252356779053744, -0.3612592362819077, 3.3004624918302046 }
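The other decompositions are obtained the same way; a minimal Java sketch, assuming Matrix.LU provides det() and Matrix.SVD exposes the singular values as the array s (verify the names against your Smile version). Since a above is orthogonal, as inv(a) equals its transpose, its determinant is ±1 and all its singular values are 1:

    import smile.math.matrix.Matrix;

    double[][] A = {
        { 0.7220180,  0.07121225, 0.6881997},
        {-0.2648886, -0.89044952, 0.3700456},
        {-0.6391588,  0.44947578, 0.6240573}
    };
    var a = Matrix.of(A);

    // LU also yields the determinant of A.
    var lu = a.lu();
    System.out.println(lu.det());  // ~ -1.0 for this orthogonal matrix

    // Singular value decomposition; all singular values are ~1.0 here.
    var svd = a.svd();
    System.out.println(java.util.Arrays.toString(svd.s));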