---
layout: post
section: "Documentation"
---

# Documentation

__warnStart

This documentation is still **work in progress**. Several sections are currently missing but will hopefully be added shortly.

__warnEnd

This is the semi-complete documentation of the `xerus` library. It does not provide you with precise function declarations or
class hierarchies (check out the [doxygen documentation](/doxygen) for those) but instead focuses on small working code snippets
to demonstrate `xerus`'s capabilities.

The documentation is organized into chapters that each focus on individual features of the library. In particular, the chapters in the "Basic
Usage" section only build upon the knowledge of the previous chapters, so they can easily be read in order. They nevertheless remain
useful as lookup resources later on.

If you have not already done so, you will most likely want to start by downloading and [building xerus](/building_xerus). If
you are uncertain or confused by our nomenclature at any point, you can consult the [nomenclature](/nomenclature) chapter, which
should hopefully allow you to deduce the meaning of all tensor-specific terms we use.

The "Basic Usage" section starts out with the creation and modification of (sparse or dense) tensors ([The Tensor Class](/tensor)),
how to use them in indexed equations to denote contractions, summations etc. ([Indices and Equations](/indices)) and finally
how to denote generalizations of matrix decompositions ([Decompositions and Solve](/decompositions)) before moving on to the more
elaborate tensor formats. The focus of `xerus` so far clearly lies on the Tensor Train decomposition ([TT-Tensors](/tttensors)) and
algorithms that use it to solve least-squares problems ([Alternating Algorithms](/als), [Riemannian Algorithms](/riemannian)) or
tensor recovery and completion problems ([Tensor Completion / Recovery](/completion)). It is nevertheless possible to use `xerus`
to construct and contract arbitrary tensor networks ([General Tensor Networks](/tensornetworks)).
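
To give a first impression of the kind of code these chapters build up to, here is a minimal sketch that creates random dense
tensors, contracts them via an indexed equation and approximates the result in the TT format. The dimensions and the rank bound
are arbitrary choices for illustration; compile and link it against `xerus` as described in [building xerus](/building_xerus),
and consult the [doxygen documentation](/doxygen) in case the exact factory functions differ in your version of the library.

```cpp
#include <iostream>
#include <xerus.h>

int main() {
    // Two dense tensors filled with random entries, cf. "The Tensor Class".
    xerus::Tensor A = xerus::Tensor::random({32, 32, 32});
    xerus::Tensor B = xerus::Tensor::random({32, 32});

    // An indexed equation denoting the contraction C_{ikl} = sum_j A_{ijl} * B_{jk},
    // cf. "Indices and Equations".
    xerus::Index i, j, k, l;
    xerus::Tensor C;
    C(i, k, l) = A(i, j, l) * B(j, k);

    // Approximate C in the Tensor Train format and truncate all TT-ranks to at most 8,
    // cf. "TT-Tensors".
    xerus::TTTensor ttC(C);
    ttC.round(8);

    std::cout << "norm of C: " << xerus::frob_norm(C) << '\n'
              << "TT-ranks of the rounded approximation:";
    for (const size_t r : ttC.ranks()) {
        std::cout << ' ' << r;
    }
    std::cout << std::endl;

    return 0;
}
```

Each of the steps above, and the many variants that exist for them, is explained in detail in the chapters linked in the
previous paragraph.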

In the "Advanced Usage" section you will find instructions on how to optimize your usage of the `xerus` library to gain the last
10-20% of speedup ([Optimization](/optimization)) as well as explanations of the debugging tools that `xerus` provides ([Debugging](/debugging)).