# xerus issues
https://git.hemio.de/xerus/xerus/-/issues

## #186 tests for python-interface (2019-06-13, Fuchsi*)
https://git.hemio.de/xerus/xerus/-/issues/186

To ensure that the Python wrapper works as intended, we should write a number of unit tests in Python. Since any such test also exercises the library itself, it might be worthwhile to replace a larger portion of the unit tests with a Python variant. One aspect should be a comparison against NumPy results.

Current errors in the pybind11 bindings:

- [x] `xe.Tensor(xe.TTTensor([2]))` raises `ValueError`
- [x] segfault in `test_pickle.py` with the Python 2 bindings
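A minimal sketch of what such a NumPy-comparison test could look like. The class and test names are hypothetical; `xe.Tensor.from_buffer` and conversion via `np.asarray` follow the usage elsewhere in this tracker, and `xe.TTTensor(tensor)` is assumed to compute a TT decomposition of a dense tensor. The tests skip cleanly when the bindings are not installed (run with `python -m unittest`).

```python
import unittest

import numpy as np

try:
    import xerus as xe
    HAVE_XERUS = True
except ImportError:
    HAVE_XERUS = False


@unittest.skipUnless(HAVE_XERUS, "xerus python bindings not installed")
class TensorVsNumpy(unittest.TestCase):
    def test_roundtrip(self):
        # Converting numpy -> xe.Tensor -> numpy should be the identity.
        a = np.random.rand(3, 4, 5)
        t = xe.Tensor.from_buffer(a)
        self.assertTrue(np.allclose(np.asarray(t), a))

    def test_tt_roundtrip(self):
        # An (exact-rank) TT decomposition of a dense tensor, cast back to a
        # dense tensor, should reproduce it up to numerical error.
        a = np.random.rand(3, 4, 5)
        tt = xe.TTTensor(xe.Tensor.from_buffer(a))
        self.assertTrue(np.allclose(np.asarray(xe.Tensor(tt)), a))
```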
Milestone: Version 4.0.

## #172 Add TT-Cross Approximation (2019-03-04, Sebastian Wolf)
https://git.hemio.de/xerus/xerus/-/issues/172
Milestone: Version X. Assignee: Philipp Trunschke.

## #118 new contraction heuristics (2017-07-13, Fuchsi*)
https://git.hemio.de/xerus/xerus/-/issues/118

- [ ] implement a brute-force search for small numbers of nodes
- [ ] write more elaborate heuristics
- [ ] statistics on which heuristics were best, and how often

Milestone: Version X.

## #79 Allow transformations between arbitrary TensorNetworks without casting to FullTensor (2019-03-04, Sebastian Wolf)
https://git.hemio.de/xerus/xerus/-/issues/79
Milestone: Version X.

## #42 templatized value_t to allow calculations with complex numbers (2019-03-04, Fuchsi*)
https://git.hemio.de/xerus/xerus/-/issues/42

This would also simplify issue #39 and is required for most quantum calculations.
- [ ] overloaded wrapper functions for LAPACKE, BLAS, and SuiteSparse
- [ ] probably two template types, value_t and real_t (the return value of e.g. frob_norm should not be complex)
- [ ] optionally, interaction between different templated versions (e.g. Tensor<double> with Tensor<float>?)
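The real_t point above can be checked against NumPy's behavior, which makes exactly this distinction: the Frobenius norm of a complex array is a real scalar. A small illustration (pure NumPy, not the xerus API):

```python
import numpy as np

# The Frobenius norm of a complex tensor is a real number, so a templated
# Tensor<value_t> wants a separate real_t for such return values
# (real_t = double for value_t = std::complex<double>).
a = np.array([[1 + 2j, 3 - 1j], [0 + 1j, 2 + 0j]])
norm = np.linalg.norm(a)  # Frobenius norm by default for matrices

assert a.dtype == np.complex128       # entries have type value_t (complex)
assert isinstance(norm, np.floating)  # the norm has type real_t (real)
assert np.isclose(norm, np.sqrt((np.abs(a) ** 2).sum()))
```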
Milestone: Version X.

## #38 flop counts of all operations to better be able to compare algorithms (2018-04-01, Fuchsi*)
https://git.hemio.de/xerus/xerus/-/issues/38

Runtimes are OK, but they depend heavily on optimization effort. FLOP counts are much more robust across architectures and independent of the amount of tuning put into the algorithms.

Milestone: Version X.

## #216 Add convenience functions for modifying the TensorNetwork graph (2019-02-15, Philipp Trunschke)
https://git.hemio.de/xerus/xerus/-/issues/216

There are some operations on `TensorNetwork`s that are not as simple as they should be.
As an example, I think a `TensorNetwork` should have the functions
- `TensorNetwork::remove_node(const size_t _nodeId)`: removes the node from the TensorNetwork and inserts the newly created external links at the end of `externalLinks` (in the order they had on the node tensor)
- `TensorNetwork::remove_link(const size_t _nodeId1, const size_t _nodeId2)`: removes the link between the given nodes and inserts the newly created external links at the end of `externalLinks`

The first function is useful when computing the gradient of a `TensorNetwork` w.r.t. the given node. The second will probably be used mainly as a subroutine of the first.

Related tasks like `add_node` can be done using Einstein notation. Maybe we can find something similar for these tasks.
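The gradient use case for `remove_node` can be illustrated with plain NumPy einsum (this is an analogy, not the xerus API): since the network value is linear in each node tensor, contracting all *other* nodes with the removed node's links left open yields exactly the gradient with respect to that node.

```python
import numpy as np

# A small 3-node network: value = sum_{i,j,k} A[i,j] B[j,k] C[k,i].
rng = np.random.default_rng(0)
A = rng.normal(size=(2, 3))
B = rng.normal(size=(3, 4))
C = rng.normal(size=(4, 2))

value = np.einsum("ij,jk,ki->", A, B, C)

# The value is linear in B, so d(value)/dB is the contraction of the
# remaining nodes with B's former links (j, k) left open. This is what
# removing B's node from the network would produce.
grad_B = np.einsum("ij,ki->jk", A, C)

# Check one entry against a finite difference (exact here, up to rounding,
# because the value is linear in B).
eps = 1e-6
B_pert = B.copy()
B_pert[1, 2] += eps
fd = (np.einsum("ij,jk,ki->", A, B_pert, C) - value) / eps
assert np.isclose(grad_B[1, 2], fd, atol=1e-4)
```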
## #236 Audit Docker (2019-06-19, RoteKekse)
https://git.hemio.de/xerus/xerus/-/issues/236
Milestone: Version 4.1. Assignee: RoteKekse.

## #238 Revisit Conda Build (2019-06-19, RoteKekse)
https://git.hemio.de/xerus/xerus/-/issues/238
Milestone: Version 4.1. Assignee: RoteKekse.

## #262 Slice TTTensor (2020-04-21, Nando Farchmin)
https://git.hemio.de/xerus/xerus/-/issues/262

It would be good to get a subtensor, i.e. a slice of the tensor: a subtensor from lower indices (l_1, ..., l_n) to upper indices (u_1, ..., u_n), where 0 <= l_i <= u_i <= tensor_dim_i; see the code below.
```
import numpy as np
import xerus as xe

def slice_tt(tt, lower, upper):
    """
    Slice the TTTensor in each component from lower to (not including) upper index.
    """
    assert len(lower) == len(upper) == len(tt.dimensions)
    assert np.all(np.array(lower) < np.array(upper))
    assert np.all(np.array(upper) <= np.array(tt.dimensions))
    diff = [u - l for l, u in zip(lower, upper)]
    # TODO: there should be a more elegant way to slice TTTensors!
    tmp = xe.TTTensor(tt)
    # Zero out the discarded slices of every component.
    for pos in range(tt.order()):
        tmp.move_core(pos)
        cmp = np.array(tmp.get_component(pos))  # writable copy
        cmp[:, :lower[pos], :] = 0
        cmp[:, upper[pos]:, :] = 0
        tmp.set_component(pos, xe.Tensor.from_buffer(cmp))
    tmp.move_core(0)
    # Copy the retained slices into a TTTensor of the sliced dimensions.
    tt_slice = xe.TTTensor.random(diff, tmp.ranks())
    for pos in range(tt.order()):
        cmp = np.asarray(tmp.get_component(pos))
        cmp = np.array(cmp[:, lower[pos]:upper[pos], :])  # contiguous copy
        tt_slice.set_component(pos, xe.Tensor.from_buffer(cmp))
    return tt_slice
```
Milestone: Version 4.1. Assignee: Philipp Trunschke.
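The TT slicing routine above can be validated against a plain NumPy reference on the dense tensor. `slice_dense` is a hypothetical helper, not part of xerus; for a TT decomposition `tt` of a dense array `arr`, casting `slice_tt(tt, lower, upper)` back to a dense tensor should match `slice_dense(arr, lower, upper)`.

```python
import numpy as np

def slice_dense(arr, lower, upper):
    """Dense reference for slice_tt: keep indices lower[i] <= k < upper[i] in mode i."""
    assert arr.ndim == len(lower) == len(upper)
    return arr[tuple(slice(l, u) for l, u in zip(lower, upper))]

# Example on a small dense tensor.
a = np.arange(24.0).reshape(2, 3, 4)
s = slice_dense(a, (0, 1, 1), (2, 3, 3))
assert s.shape == (2, 2, 2)
assert np.array_equal(s, a[0:2, 1:3, 1:3])
```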