Tutorial

Going through the tutorials should answer many questions for users who are new to FLENS.

We also try to give insight into the concepts used in the design of FLENS.

Matrices, Vectors and Basic Linear Algebra Operations

Page 1

Full Storage Schemes and General Matrices

  • What is a general matrix?

  • The concept of splitting a matrix type into a matrix interface and a storage scheme.

  • A simple example showing how to use a general matrix.

  • What matrix views are and how to use them.

  • How are matrix types and matrix views realized in FLENS?
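
To give a first taste of this page, here is a minimal sketch (not taken from the tutorial itself) of creating a general matrix and a matrix view. It assumes that FLENS is installed, that the single header flens/flens.cxx is on the include path and that a C++11 compiler is used.

    #include <flens/flens.cxx>
    #include <iostream>

    using namespace flens;

    int main()
    {
        // General matrix with full (column-major) storage.
        typedef GeMatrix<FullStorage<double, ColMajor> >  Matrix;
        Matrix A(4, 4);

        // Element access is 1-based by default.
        for (int i = 1; i <= A.numRows(); ++i) {
            for (int j = 1; j <= A.numCols(); ++j) {
                A(i, j) = 10*i + j;
            }
        }

        // The underscore object builds index ranges for matrix views.
        const Underscore<Matrix::IndexType>  _;

        // B references rows 2..3 and columns 2..4 of A; no data is copied.
        auto B = A(_(2, 3), _(2, 4));
        B(B.firstRow(), B.firstCol()) = -1;   // top-left element of the view, i.e. A(2,2)

        std::cout << "A = " << A << std::endl;
        std::cout << "B = " << B << std::endl;
    }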

Page 2

Array Storage and Dense Vectors

  • What is a dense vector?

  • An example of using dense vectors.

  • Working with vector views.

  • Vector views referencing rows, columns or diagonals of a matrix.
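
As a small foretaste of this page, and under the same assumptions as above (header flens/flens.cxx, C++11 compiler), here is a minimal sketch of dense vectors, vector views and vector views into a matrix.

    #include <flens/flens.cxx>
    #include <iostream>

    using namespace flens;

    int main()
    {
        typedef GeMatrix<FullStorage<double, ColMajor> >  Matrix;
        typedef DenseVector<Array<double> >                Vector;
        typedef Matrix::IndexType                          IndexType;

        // A dense vector of length 5; indexing is 1-based by default.
        Vector x(5);
        for (IndexType i = 1; i <= x.length(); ++i) {
            x(i) = i*i;
        }

        const Underscore<IndexType>  _;

        // y references elements 2..4 of x; no data is copied.
        auto y = x(_(2, 4));
        y(y.firstIndex()) = 0;            // also changes x(2)

        // Vector views into a matrix: one row, one column and the diagonal.
        Matrix A(3, 3);
        A = 1, 2, 3,
            4, 5, 6,
            7, 8, 9;

        auto row2 = A(2, _);              // second row of A
        auto col3 = A(_, 3);              // third column of A
        auto diag = A.diag(0);            // main diagonal of A

        std::cout << "x = " << x << std::endl;
        std::cout << "row2 = " << row2 << std::endl;
        std::cout << "col3 = " << col3 << std::endl;
        std::cout << "diag = " << diag << std::endl;
    }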

Page 3

Accessing Raw Data Pointers and Strides

  • How to access the raw data pointer and strides of a matrix or vector.

  • Concepts for implementing numerical algorithms that require workspace buffers:

    • Design patterns.

    • Creating matrix/vector views from local buffers (Stack buffer).

    • Creating matrix/vector views from global buffers (Data segment buffer).
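
The following sketch hints at both themes of this page: querying the raw pointer and strides of existing objects, and wrapping a plain C array (here a stack buffer) in a matrix view. The argument order of the FullStorageView constructor used below (rows, columns, data pointer, leading dimension) is our assumption; the page explains the exact interface.

    #include <flens/flens.cxx>
    #include <iostream>

    using namespace flens;

    int main()
    {
        typedef GeMatrix<FullStorage<double, ColMajor> >  Matrix;
        typedef DenseVector<Array<double> >                Vector;

        Matrix A(3, 4);
        Vector x(6);

        // Raw data access, e.g. for calling an external BLAS/LAPACK routine.
        double      *a    = A.data();
        const auto   lda  = A.leadingDimension();
        double      *px   = x.data();
        const auto   incx = x.stride();

        std::cout << "a = " << a << ", lda = " << lda << std::endl;
        std::cout << "px = " << px << ", incx = " << incx << std::endl;

        // Wrapping a local (stack) buffer in a matrix view: the view
        // references the buffer instead of allocating memory of its own.
        // (Assumed constructor arguments: rows, cols, data, leading dimension.)
        typedef FullStorageView<double, ColMajor>  StorageView;
        typedef GeMatrix<StorageView>              MatrixView;

        double buffer[3*3] = {};
        MatrixView W = StorageView(3, 3, buffer, 3);
        W(1, 1) = 42;                     // writes into buffer[0]

        std::cout << "buffer[0] = " << buffer[0] << std::endl;
    }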

Page 4

Triangular/Trapezoidal Matrices with Full Storage

  • Why use full storage for triangular matrices?

  • TrMatrix is not a triangular matrix. And that is a good thing.

    • Zeros are not explicitly stored. Accidentally accessing elements from the wrong triangular part triggers an assertion failure.

    • Sometimes you have to modify the upper triangular part of a lower TrMatrix (and vice versa). We just make sure you do not do it by accident.

  • How (and why) GeMatrix and TrMatrix can share the same data:

    • TrMatrix views from GeMatrix.

    • GeMatrix views from TrMatrix.
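
As a small sketch of the shared-data idea (the view methods upper(), lowerUnit() and general() are used here as we recall them; the page introduces them properly):

    #include <flens/flens.cxx>
    #include <iostream>

    using namespace flens;

    int main()
    {
        typedef GeMatrix<FullStorage<double, ColMajor> >  Matrix;

        Matrix A(3, 3);
        A = 1, 2, 3,
            4, 5, 6,
            7, 8, 9;

        // U is an upper triangular view of A: same data, triangular interface.
        auto U = A.upper();
        std::cout << "U(1,2) = " << U(1, 2) << std::endl;

        // U(2,1) lies in the non-stored lower part; accessing it would
        // trigger an assertion failure in a debug build.

        // A.lower() / A.lowerUnit() give the analogous lower triangular views
        // (the latter with an implicit unit diagonal).

        // A GeMatrix view of a triangular matrix exposes the full
        // rectangular storage again.
        auto G = U.general();
        std::cout << "G(2,1) = " << G(2, 1) << std::endl;
    }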

Page 5

Symmetric and Hermitian Matrices with Full Storage

  • How (and why) TrMatrix, SyMatrix and HeMatrix can share the same data:

    • TrMatrix views from SyMatrix and HeMatrix.

    • SyMatrix and HeMatrix views from TrMatrix.
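
A sketch of the same idea for symmetric matrices follows; treat the exact names of the view methods (symmetric() and, for complex element types, hermitian()) as our assumption until you reach the page.

    #include <flens/flens.cxx>
    #include <iostream>

    using namespace flens;

    int main()
    {
        typedef GeMatrix<FullStorage<double, ColMajor> >  Matrix;

        Matrix A(3, 3);
        A = 1, 2, 3,
            4, 5, 6,
            7, 8, 9;

        // S interprets the upper triangle of A as a symmetric matrix:
        // only that triangle is stored, the lower part is implied.
        auto S = A.upper().symmetric();
        std::cout << "S(1,3) = " << S(1, 3) << std::endl;

        // For a complex matrix B the Hermitian view would be built
        // analogously, e.g. B.upper().hermitian()  (assumed name).
    }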

Page 6

Basic Linear Algebra Operations (BLAS)

  • Who gives you High Performance? It's BLAS, not the Compiler!

  • Benchmarks comparing FLENS/ulmBLAS with other BLAS implementations.

  • Vector operations.

  • Matrix-vector operations.

  • Matrix-matrix operations.

  • High-level and low-level interfaces for linear algebra operations.

  • Using overloaded operators and why you still have full control.
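
A minimal sketch of the low-level interface: blas::axpy, blas::mv and blas::mm are the FLENS counterparts of the BLAS routines axpy, gemv and gemm. The high-level (operator) notation is the topic of the next page.

    #include <flens/flens.cxx>
    #include <iostream>

    using namespace flens;

    int main()
    {
        typedef GeMatrix<FullStorage<double, ColMajor> >  Matrix;
        typedef DenseVector<Array<double> >                Vector;

        Matrix A(2, 2), B(2, 2), C(2, 2);
        Vector x(2), y(2);

        A = 1, 2,
            3, 4;
        B = 1, 0,
            0, 1;
        x = 1, 1;
        y = 0, 0;

        // Level 1:  y = 2*x + y
        blas::axpy(2.0, x, y);

        // Level 2:  y = 1*A*x + 0*y
        blas::mv(NoTrans, 1.0, A, x, 0.0, y);

        // Level 3:  C = 1*A*B + 0*C
        blas::mm(NoTrans, NoTrans, 1.0, A, B, 0.0, C);

        std::cout << "y = " << y << std::endl;
        std::cout << "C = " << C << std::endl;
    }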

Page 7

Using Operators for BLAS

  • Why the use of overloaded operators has a bad reputation in high-performance computing.

  • What users coming from Matlab should be aware of.

  • How FLENS provides a transparent and efficient solution.

  • How other C++ libraries deal with these issues.
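
To anticipate the punch line with a minimal sketch: assignments that match a BLAS pattern are evaluated by a single BLAS call, without hidden temporaries. The details, and the pitfalls users coming from Matlab should watch out for, are discussed on the page.

    #include <flens/flens.cxx>

    using namespace flens;

    int main()
    {
        typedef GeMatrix<FullStorage<double, ColMajor> >  Matrix;
        typedef DenseVector<Array<double> >                Vector;

        Matrix A(3, 3), B(3, 3), C(3, 3);
        Vector x(3), y(3);

        A = 1, 2, 3,
            4, 5, 6,
            7, 8, 9;
        B = 9, 8, 7,
            6, 5, 4,
            3, 2, 1;
        x = 1, 2, 3;

        // Each assignment below is mapped onto one BLAS call.
        y  = A*x;          // gemv
        y += A*x;          // gemv with beta = 1
        C  = A*B;          // gemm
    }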

Page 8

Coding a High Performance LU Factorization with FLENS

  • This is what Matlab does behind the scenes when you solve a system of linear equations.

  • It's the algorithm!

    • Deriving algorithms that are based on matrix/vector operations.

    • Unblocked algorithm for the LU factorization.

    • Blocked algorithm for the LU factorization.

  • Implementation of the unblocked and blocked algorithms:

    • Matrix/vector views are a big help!

    • Benchmarks
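
As a teaser, here is a sketch of what the unblocked variant can look like when it is expressed with matrix/vector views. This is our own simplified version, not the implementation developed on the page; it uses blas::swap, blas::scal and blas::axpy and assumes that views of empty ranges (as they occur in the last iteration) are allowed.

    #include <flens/flens.cxx>
    #include <algorithm>
    #include <cmath>

    using namespace flens;

    // Unblocked LU factorization with partial pivoting (sketch).
    // Returns 0 on success, or the index of the first zero pivot.
    template <typename MA, typename VP>
    typename GeMatrix<MA>::IndexType
    lu_unblocked(GeMatrix<MA> &A, DenseVector<VP> &piv)
    {
        typedef typename GeMatrix<MA>::IndexType    IndexType;
        typedef typename GeMatrix<MA>::ElementType  T;

        const Underscore<IndexType>  _;
        const IndexType  m  = A.numRows();
        const IndexType  n  = A.numCols();
        const IndexType  mn = std::min(m, n);

        for (IndexType j = 1; j <= mn; ++j) {
            // Pivot search: largest absolute value in column j, rows j..m.
            IndexType p = j;
            for (IndexType i = j+1; i <= m; ++i) {
                if (std::abs(A(i, j)) > std::abs(A(p, j))) {
                    p = i;
                }
            }
            piv(j) = p;
            if (A(p, j) == T(0)) {
                return j;                         // zero pivot
            }

            // Interchange rows j and p (row views of A).
            if (p != j) {
                auto rowJ = A(j, _);
                auto rowP = A(p, _);
                blas::swap(rowJ, rowP);
            }

            // Scale the subdiagonal part of column j ...
            auto colJ = A(_(j+1, m), j);
            blas::scal(T(1)/A(j, j), colJ);

            // ... and apply the rank-1 update to the trailing submatrix,
            // one column at a time:  A(j+1:m, k) -= A(j, k)*A(j+1:m, j).
            for (IndexType k = j+1; k <= n; ++k) {
                auto colK = A(_(j+1, m), k);
                blas::axpy(-A(j, k), colJ, colK);
            }
        }
        return 0;
    }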

Page 9

Numerical Linear Algebra (FLENS-LAPACK, LAPACK)

  • Using FLENS-LAPACK.

  • Examples of:

    • Solving a system of linear equations.

    • Computing LU and QR factorizations.

    • Computing eigenvalues and eigenvectors.
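
To close with a concrete sketch of the first two bullets: solving A*x = b via an LU factorization with FLENS-LAPACK. Here lapack::trf and lapack::trs play the roles of the LAPACK routines getrf and getrs, and the solution overwrites the right-hand side.

    #include <flens/flens.cxx>
    #include <iostream>

    using namespace flens;

    int main()
    {
        typedef GeMatrix<FullStorage<double, ColMajor> >  Matrix;
        typedef DenseVector<Array<double> >                Vector;
        typedef Matrix::IndexType                          IndexType;
        typedef DenseVector<Array<IndexType> >             IndexVector;

        Matrix A(3, 3);
        Vector b(3);

        A = 2, 1, 1,
            1, 3, 2,
            1, 0, 0;
        b = 4, 5, 6;

        // LU factorization of A (getrf), then the triangular solves (getrs);
        // afterwards b contains the solution x.
        IndexVector piv(3);
        lapack::trf(A, piv);
        lapack::trs(NoTrans, A, piv, b);

        std::cout << "x = " << b << std::endl;
    }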