[platform/core/ml/nntrainer.git] / nntrainer / tensor / weight.h
2021-04-27  Juyeong Lee       Fix indent in the SPDX identifier
2021-03-19  Parichay Kapoor   [Weights] Split weight variable init and alloc
2021-03-16  Parichay Kapoor   [all] remove friends
2021-03-08  Parichay Kapoor   [manager] Support deallocation of memory
2021-03-03  Parichay Kapoor   [tensor] Allow updating batch size after allocation
2021-03-03  Parichay Kapoor   [optimizer] Update to camelcase
2021-03-03  Parichay Kapoor   [weight] Move apply gradient to weight
2021-03-03  Parichay Kapoor   [weight/layer] Move weight regularization out of layers
2021-02-05  Parichay Kapoor   [Weight] Cleanup train argument for initialize gradient
2021-02-05  Parichay Kapoor   [var_grad] Improve nomenclature
2021-02-05  Parichay Kapoor   [Manager] Support lazy memory allocation with manager
2021-01-26  MyungJoo Ham      License Fix / Relicense to Apache-2.0
2021-01-25  Parichay Kapoor   [var_grad] Remove redundant argument for initializeWeight
2021-01-25  Parichay Kapoor   [weight] Decouple init of weight and gradients
2021-01-25  Parichay Kapoor   [manager] Donot allocate adam for inference
2020-12-24  Jihoon Lee        [Tensor] Add Tensor Wrap method
2020-12-10  Parichay Kapoor   [network] Rework the backwarding
2020-12-10  Parichay Kapoor   [optimizer] Move optimizer out of layer
2020-12-10  Parichay Kapoor   [optimizer] Move optimizer variables to weights
2020-12-07  Parichay Kapoor   [manager] Share gradient memory for all layer
2020-12-07  Parichay Kapoor   [layers/manager] Register weights with manager
2020-12-07  Parichay Kapoor   [weight] Updated weights to be vector
2020-12-07  Parichay Kapoor   [weight/var_grad] Make internal variable as shared_ptr
2020-11-27  Parichay Kapoor   [var_grad] Add var_grad for input/output lists
2020-11-06  Parichay Kapoor   [restructure] Restructure the core files