[platform/core/ml/nntrainer.git] / nntrainer / layers / concat_layer.cpp
2021-12-28  Jihoon Lee       [Semantics] InitContext NumOutput is hint
2021-11-29  Jihoon Lee       [layer devel] clean up header dependency
2021-11-25  Parichay Kapoor  [layer] Add exportTo for concat layer
2021-10-21  Jihoon Lee       [SVACE] Fix init problem
2021-10-21  Jihoon Lee       [Concat] Add axis selection
2021-10-20  Parichay Kapoor  [layer] Support concat along all the dimensions
2021-09-13  hyeonseok lee    [layer] clean up parse_util header
2021-07-22  Parichay Kapoor  [LayerV1] Delete for LayerV1
2021-07-22  Parichay Kapoor  [layer] Update concat layer to V2
2021-07-22  Parichay Kapoor  [layer] Remove num_inputs/num_outputs properties
2021-06-23  Parichay Kapoor  [addition] Bug fix addition layer calcDerivative
2021-06-18  Jihoon Lee       [CC17] Update type to inline static
2021-06-16  Jihoon Lee       [Refactor] s/Layer/LayerV1
2021-04-27  Juyeong Lee      Fix indent in the SPDX identifier
2021-03-15  Parichay Kapoor  [layer] refactor num of inputs/outputs
2021-01-21  Jihoon Lee       [Layer] Add eval mode for the training  (tags: accepted/tizen/unified/20210122.084701, submit/tizen/20210122.000930)
2021-01-21  Parichay Kapoor  [manager] Optimize input/output memory for inference
2020-12-30  Parichay Kapoor  [model] Optimize model input/output
2020-12-28  Parichay Kapoor  [layer] Use gradient instead of variable for derivative
2020-12-28  Parichay Kapoor  [layer] Move layer input/output management to manager
2020-12-07  Parichay Kapoor  [manager] Added nntrainer manager for weights
2020-11-30  jijoong.moon     [ Graph ] remove grad mem buffer for backwarding
2020-11-27  Parichay Kapoor  [layers] Split backwarding into smaller functions
2020-11-26  jijoong.moon     [ Graph ] Modify backwarding/forwarding to use graph...
2020-11-19  Parichay Kapoor  [ccapi] Direct methods to create layers
2020-11-06  Parichay Kapoor  [restructure] Restructure the core files