Evan Shelhamer [Tue, 17 Feb 2015 05:41:24 +0000 (21:41 -0800)]
[docs] README dispatch
Evan Shelhamer [Tue, 17 Feb 2015 05:00:04 +0000 (21:00 -0800)]
[docs] note new CMake build
Evan Shelhamer [Tue, 17 Feb 2015 04:55:15 +0000 (20:55 -0800)]
Merge pull request #1667 from Nerei/feature/cmake_well_done
Improve CMake build with automation, options, and more.
Anatoly Baksheev [Sun, 1 Feb 2015 14:45:31 +0000 (17:45 +0300)]
CMake 2.8.7 support
Anatoly Baksheev [Sun, 18 Jan 2015 20:22:09 +0000 (23:22 +0300)]
[travis] proper cmake params
Anatoly Baksheev [Sun, 18 Jan 2015 19:55:26 +0000 (22:55 +0300)]
opencv 3.0 compilation (replaces #1714)
Anatoly Baksheev [Wed, 17 Dec 2014 09:09:14 +0000 (12:09 +0300)]
improve CMake build
Anatoly Baksheev [Sat, 20 Dec 2014 15:22:47 +0000 (18:22 +0300)]
ignore qtcreator files
Evan Shelhamer [Tue, 17 Feb 2015 04:31:52 +0000 (20:31 -0800)]
Merge pull request #1313 from shelhamer/reshape-data-layer
Reshape single input batches for inputs of varying dimension
Evan Shelhamer [Fri, 6 Feb 2015 00:31:52 +0000 (16:31 -0800)]
test reshaping DATA and IMAGE_DATA
Evan Shelhamer [Fri, 17 Oct 2014 05:52:30 +0000 (22:52 -0700)]
reshape DATA + IMAGE_DATA for inputs of varying dimension
To feed inputs of varying dimension, the `DATA` and `IMAGE_DATA` layers
reshape their prefetch and top blobs when the batch size is 1.
The `BasePrefetchingDataLayer` always reshapes on forward.
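The reshape-on-forward behavior described above can be sketched in numpy (an illustrative sketch only; `forward_prefetch` and its signature are hypothetical, not Caffe's API):

```python
import numpy as np

def forward_prefetch(prefetch_batch, top, batch_size):
    # Illustrative sketch (not Caffe's API) of reshape-on-forward:
    # with a single-image batch there is no shape conflict inside the
    # batch, so the top blob simply takes the new input's dimensions.
    if batch_size == 1 and top.shape != prefetch_batch.shape:
        top = np.empty_like(prefetch_batch)
    top[...] = prefetch_batch
    return top
```

With batch size 1, a 480x640 input replaces a previously 227x227-shaped top blob instead of failing.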
Evan Shelhamer [Tue, 17 Feb 2015 00:26:03 +0000 (16:26 -0800)]
Merge pull request #1884 from tnarihi/fix-python-draw
Fix `draw` to support new protobuf format
Takuya Narihira [Tue, 17 Feb 2015 00:06:03 +0000 (16:06 -0800)]
Fix `draw` to support new protobuf format
James Steven Supancic III [Thu, 12 Feb 2015 21:57:25 +0000 (13:57 -0800)]
Fix Draw Net Problem #1709
Introduced by "Layer type is a string" (#1694)
Jeff Donahue [Mon, 16 Feb 2015 01:14:11 +0000 (17:14 -0800)]
Merge pull request #1874 from jeffdonahue/blob-math-test-precision
BlobMathTest: fix precision issues
Jeff Donahue [Mon, 16 Feb 2015 01:05:10 +0000 (17:05 -0800)]
BlobMathTest: fixes for numerical precision issues
Jeff Donahue [Sat, 14 Feb 2015 01:41:43 +0000 (17:41 -0800)]
Merge pull request #1757 from jeffdonahue/clip-grads
Gradient clipping
Jeff Donahue [Tue, 7 Oct 2014 06:46:48 +0000 (23:46 -0700)]
Add gradient clipping -- limit L2 norm of parameter gradients
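The clipping rule above can be sketched as follows (a minimal numpy sketch; the function name and call convention are illustrative, not Caffe's Solver API):

```python
import numpy as np

def clip_gradients(grads, clip_threshold):
    # Hypothetical sketch of L2-norm gradient clipping: if the global
    # L2 norm of all parameter gradients exceeds the threshold, scale
    # every gradient down so the norm equals the threshold.
    l2norm = np.sqrt(sum(float((g * g).sum()) for g in grads))
    if l2norm > clip_threshold:
        scale = clip_threshold / l2norm
        for g in grads:
            g *= scale  # in place, in the spirit of Blob::scale_diff
    return l2norm
```

For gradients [3] and [4] the global norm is 5; clipping at 1 rescales them to [0.6] and [0.8].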
Jeff Donahue [Mon, 19 Jan 2015 22:49:39 +0000 (14:49 -0800)]
add Net::param_owners accessor for param sharing info
Jeff Donahue [Tue, 7 Oct 2014 06:45:43 +0000 (23:45 -0700)]
Blob: add scale_{data,diff} methods and tests
Jeff Donahue [Thu, 5 Feb 2015 22:04:35 +0000 (14:04 -0800)]
SoftmaxWithLossLayer fix: takes exactly 2 bottom blobs (inherited from
LossLayer)
Evan Shelhamer [Tue, 10 Feb 2015 19:43:12 +0000 (11:43 -0800)]
Merge pull request #1841 from shelhamer/no-memory-or-hdf5-transform
HDF5_DATA + MEMORY_DATA refuse loudly to transform
Evan Shelhamer [Tue, 10 Feb 2015 19:21:45 +0000 (11:21 -0800)]
Merge pull request #1851 from jeffdonahue/cudnn-layer-factory-test-fix
Fix for CuDNN layer tests: only destroy handles if setup
Jeff Donahue [Tue, 10 Feb 2015 02:08:50 +0000 (18:08 -0800)]
Fixes for CuDNN layers: only destroy handles if setup
Jon Long [Mon, 9 Feb 2015 09:13:16 +0000 (01:13 -0800)]
Merge pull request #1838 from DmitryUlyanov/dev
Fix MemoryData bug that prevented using arrays with n_ * size_ > 2^31
Evan Shelhamer [Sat, 7 Feb 2015 07:12:33 +0000 (23:12 -0800)]
HDF5_DATA + MEMORY_DATA refuse loudly to transform
These layers are meant for generic inputs, so they do no
transformations. Since `transform_param` was pulled out of the data
layers' proto definitions for reuse, these layers had silently ignored
transformation parameters instead of signalling their refusal.
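The loud refusal might look like this in spirit (a hypothetical Python sketch; the actual check is presumably a C++ CHECK in the layer's setup, and the names here are illustrative):

```python
def check_no_transform(layer_type, layer_param):
    # Generic input layers reject a transform_param instead of
    # silently ignoring it, so misconfigured nets fail at setup.
    if "transform_param" in layer_param:
        raise ValueError("%s does not transform data; remove transform_param"
                         % layer_type)
```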
Dmitry Ulyanov [Fri, 6 Feb 2015 07:56:47 +0000 (10:56 +0300)]
Allow using arrays with n_ * size_ > 2^31
use `uint64_t` / `size_t` for offsets instead of int
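Why 32-bit int offsets fail for such arrays can be shown in Python by emulating C's wraparound (the helper is illustrative, not part of the change):

```python
def int32_wrap(x):
    # emulate C's 32-bit signed integer wraparound
    x &= 0xFFFFFFFF
    return x - 2**32 if x >= 2**31 else x

n, size = 100_000, 30_000   # a memory blob of 3e9 elements
offset = n * size           # exact in Python: 3_000_000_000
# the same product stored in a 32-bit int wraps past 2^31 and goes
# negative, which is why the offset types move to uint64_t / size_t
wrapped = int32_wrap(offset)
```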
Evan Shelhamer [Sat, 7 Feb 2015 06:05:12 +0000 (22:05 -0800)]
Merge pull request #1416 from mtamburrano/matVector
Feed cv::Mats to MemoryDataLayer and set batch size on-the-fly.
Evan Shelhamer [Sat, 7 Feb 2015 05:37:54 +0000 (21:37 -0800)]
groom #1416
- keep current `DataTransformer` check so that datums can be transformed
into a blob incrementally
- standardize check messages
- include opencv where needed, drop unneeded OS X guards
TODO these tests need to be de-duped
manuele [Mon, 12 Jan 2015 11:22:49 +0000 (12:22 +0100)]
removed needs_reshape_ and ChangeBatchSize is now set_batch_size
manuele [Thu, 8 Jan 2015 18:26:20 +0000 (19:26 +0100)]
small fixes
manuele [Tue, 9 Dec 2014 14:12:28 +0000 (15:12 +0100)]
MemoryDataLayer now correctly consumes batch_size elements
manuele [Mon, 10 Nov 2014 18:24:02 +0000 (19:24 +0100)]
MemoryDataLayer now accepts dynamic batch_size
manuele [Fri, 7 Nov 2014 19:01:33 +0000 (20:01 +0100)]
Added opencv vector<Mat> to memory data layer with tests
Evan Shelhamer [Sat, 7 Feb 2015 04:40:40 +0000 (20:40 -0800)]
Merge pull request #1789 from SaganBolliger/softmax_loss_gpu
GPU version of SoftmaxWithLossLayer
Sagan Bolliger [Fri, 6 Feb 2015 20:54:32 +0000 (12:54 -0800)]
Added GPU implementation of SoftmaxWithLossLayer.
Jeff Donahue [Fri, 6 Feb 2015 19:47:49 +0000 (11:47 -0800)]
Merge pull request #1837 from shelhamer/image-fail-die
Die on inputs that fail to load
Jeff Donahue [Fri, 6 Feb 2015 19:46:48 +0000 (11:46 -0800)]
Merge pull request #1840 from shelhamer/fix-power-test
Fix PowerLayer gradient check failures by reducing step size
Evan Shelhamer [Fri, 6 Feb 2015 09:37:02 +0000 (01:37 -0800)]
reduce step size in PowerLayer gradient checks: fix #1252
The gradient checker fails on certain elements of the PowerLayer checks,
but only 1-3 out of the 120 elements tested sometimes fail. This is due
not to any numerical issue in the PowerLayer but to the distribution of
the random inputs for the checks: boost 1.56 switched the normal
distribution RNG engine from Box-Muller to Ziggurat, changing the
inputs the tests draw.
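The step-size effect can be illustrated with a central-difference check on a toy function (an illustration only, not the actual PowerLayer test):

```python
def numeric_grad(f, x, h):
    # central-difference estimate, as used by gradient checkers
    return (f(x + h) - f(x - h)) / (2.0 * h)

f = lambda x: x ** 3          # toy stand-in for a power function
analytic = 3.0 * 2.0 ** 2     # exact derivative at x = 2
err_coarse = abs(numeric_grad(f, 2.0, 1e-1) - analytic)
err_fine = abs(numeric_grad(f, 2.0, 1e-3) - analytic)
# the truncation error shrinks like h^2, so a smaller step size keeps
# badly-scaled random inputs within the checker's tolerance
```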
Evan Shelhamer [Fri, 6 Feb 2015 08:51:25 +0000 (00:51 -0800)]
build with libc++ on Yosemite with CUDA 7
Jeff Donahue [Fri, 6 Feb 2015 04:05:20 +0000 (20:05 -0800)]
Merge pull request #1836 from jeffdonahue/loss-param-upgrade-fix
fix for layer-type-str: loss_param in V1LayerParameter
Jeff Donahue [Fri, 6 Feb 2015 03:28:42 +0000 (19:28 -0800)]
fix for layer-type-str: loss_param and the DECONVOLUTION type should
have been included in V1LayerParameter; they now get upgraded
Evan Shelhamer [Fri, 6 Feb 2015 00:41:46 +0000 (16:41 -0800)]
die on inputs to IMAGE_DATA that fail to load
It's better to know than march silently on.
Jeff Donahue [Fri, 6 Feb 2015 00:06:17 +0000 (16:06 -0800)]
Merge pull request #1694 from jeffdonahue/layer-type-str
Layer type is a string
Jeff Donahue [Thu, 5 Feb 2015 23:17:24 +0000 (15:17 -0800)]
Upgrade existing nets using upgrade_net_proto_text tool
Restore comments afterwards
Jeff Donahue [Thu, 15 Jan 2015 09:43:23 +0000 (01:43 -0800)]
start layer parameter field IDs at 100
(always want them printed at the end, and want to allow more fields to
be added in the future, so reserve fields 10-99 for that purpose)
Jeff Donahue [Thu, 15 Jan 2015 08:13:23 +0000 (00:13 -0800)]
get rid of NetParameterPrettyPrint as layer is now after inputs
(whoohoo)
Jeff Donahue [Thu, 15 Jan 2015 04:17:26 +0000 (20:17 -0800)]
add message ParamSpec to replace param name, blobs_lr, weight_decay, ...
Jeff Donahue [Thu, 15 Jan 2015 03:00:58 +0000 (19:00 -0800)]
add test that all V1 layer type enum values upgrade to valid V2 string
types
Jeff Donahue [Thu, 15 Jan 2015 02:46:21 +0000 (18:46 -0800)]
add v1 to v2 upgrade tests
Jeff Donahue [Thu, 15 Jan 2015 02:31:23 +0000 (18:31 -0800)]
restore test_upgrade_proto to dev version
Jeff Donahue [Thu, 15 Jan 2015 00:55:14 +0000 (16:55 -0800)]
automagic upgrade for v1->v2
Jeff Donahue [Wed, 14 Jan 2015 23:07:20 +0000 (15:07 -0800)]
restore upgrade_proto
Jeff Donahue [Tue, 13 Jan 2015 01:02:15 +0000 (17:02 -0800)]
'layers' -> 'layer'
Jeff Donahue [Mon, 12 Jan 2015 21:42:51 +0000 (13:42 -0800)]
Add unit test for LayerRegistry::CreateLayer
Jeff Donahue [Mon, 12 Jan 2015 22:27:06 +0000 (14:27 -0800)]
DataLayer and HDF5OutputLayer can be constructed and destroyed without
errors
Jeff Donahue [Thu, 8 Jan 2015 04:41:09 +0000 (20:41 -0800)]
Layer type is a string
Evan Shelhamer [Wed, 4 Feb 2015 18:15:21 +0000 (10:15 -0800)]
fix Nesterov typo found by @bamos
Pannous [Fri, 30 Jan 2015 12:38:48 +0000 (13:38 +0100)]
fixed small bug: output label_file -> label_filename
Jeff Donahue [Mon, 2 Feb 2015 20:38:43 +0000 (12:38 -0800)]
add space after "Loading mean file from"
Evan Shelhamer [Mon, 2 Feb 2015 17:45:51 +0000 (09:45 -0800)]
fix GoogLeNet license overwritten by back-merge (see #1650)
Evan Shelhamer [Sun, 1 Feb 2015 07:53:58 +0000 (23:53 -0800)]
Merge pull request #1615 from longjon/deconv-layer
Add deconvolution layer with refactoring of convolution layer to share code
Evan Shelhamer [Fri, 30 Jan 2015 04:58:58 +0000 (20:58 -0800)]
Merge pull request #1748 from longjon/db-wrappers
Simple database wrappers
Evan Shelhamer [Fri, 30 Jan 2015 04:52:32 +0000 (20:52 -0800)]
Merge pull request #1794 from shelhamer/no-dump-net
drop dump_network tool
Evan Shelhamer [Fri, 30 Jan 2015 04:52:16 +0000 (20:52 -0800)]
Merge pull request #1654 from longjon/softmax-missing-values
Add missing value support to SoftmaxLossLayer
Jeff Donahue [Fri, 30 Jan 2015 03:17:46 +0000 (19:17 -0800)]
Merge pull request #1753 from jeffdonahue/enhance-debug-info
Enhance debug info
Jeff Donahue [Mon, 19 Jan 2015 19:52:04 +0000 (11:52 -0800)]
debug_info in NetParameter so it can be enabled outside training
Jeff Donahue [Tue, 7 Oct 2014 06:46:05 +0000 (23:46 -0700)]
debug_info: print param (and gradient) stats for whole Net after Backward
Jeff Donahue [Fri, 30 Jan 2015 03:07:12 +0000 (19:07 -0800)]
Add BlobMathTest with unit tests for sumsq and asum
Jeff Donahue [Tue, 7 Oct 2014 06:08:38 +0000 (23:08 -0700)]
Blob: add sumsq_{data,diff} methods
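The two reductions can be sketched in numpy (illustrative only; Blob's methods operate on its own data/diff arrays):

```python
import numpy as np

diff = np.array([1.0, -2.0, 3.0])
asum = float(np.abs(diff).sum())    # L1 norm, as asum_{data,diff} computes
sumsq = float((diff * diff).sum())  # squared L2 norm, as sumsq_{data,diff} computes
```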
Jeff Donahue [Wed, 24 Sep 2014 22:52:14 +0000 (15:52 -0700)]
Enhancements for debug_info to display more information.
Now displays for:
- net inputs
- test nets
- params in ForwardDebugInfo
Jonathan L Long [Tue, 27 Jan 2015 21:27:48 +0000 (13:27 -0800)]
[test] gradient checks for softmax ignore_label and normalize: false
Jonathan L Long [Tue, 27 Jan 2015 21:09:32 +0000 (13:09 -0800)]
document the loss_param options to SoftmaxWithLossLayer
Jonathan L Long [Tue, 27 Jan 2015 20:24:39 +0000 (12:24 -0800)]
[test] simple test for DeconvolutionLayer
Jonathan L Long [Tue, 27 Jan 2015 18:45:41 +0000 (10:45 -0800)]
document DeconvolutionLayer
Evan Shelhamer [Mon, 26 Jan 2015 07:13:52 +0000 (23:13 -0800)]
Merge pull request #1555 from drdan14/draw-net-improvements
Improvements to network drawing via draw_net.py
Evan Shelhamer [Mon, 26 Jan 2015 07:08:38 +0000 (23:08 -0800)]
Merge pull request #1632 from 7hil/cifar_lmdb
switch cifar10 example to lmdb
Evan Shelhamer [Mon, 26 Jan 2015 07:06:57 +0000 (23:06 -0800)]
Merge pull request #1746 from dj1989/mat_hdf5_demo
Matlab demo for Caffe-compatible HDF5 read/write
Evan Shelhamer [Mon, 26 Jan 2015 07:05:33 +0000 (23:05 -0800)]
Merge pull request #1755 from jeffdonahue/softmax-optimization
SoftmaxLayer GPU optimization
Evan Shelhamer [Mon, 26 Jan 2015 04:37:52 +0000 (20:37 -0800)]
[pycaffe] de-dupe imports
Evan Shelhamer [Mon, 26 Jan 2015 00:04:51 +0000 (16:04 -0800)]
[example] lenet early stopping -> mnist examples
Jeff Donahue [Sun, 25 Jan 2015 22:06:48 +0000 (14:06 -0800)]
Merge pull request #1754 from jeffdonahue/softmax-loss-fix
SoftmaxWithLossLayer: use CreateLayer
Evan Shelhamer [Sat, 24 Jan 2015 06:46:53 +0000 (22:46 -0800)]
drop dump_network tool
Nets are better serialized as a single binaryproto or saved however
desired through the Python and MATLAB interfaces.
Jeff Donahue [Fri, 23 Jan 2015 04:03:15 +0000 (20:03 -0800)]
Merge pull request #1787 from shelhamer/pytest-caffe-set
[fix] align pytest for solver with #1728
Evan Shelhamer [Fri, 23 Jan 2015 03:54:57 +0000 (19:54 -0800)]
[fix] align pytest for solver with #1728
Jeff Donahue [Fri, 23 Jan 2015 03:45:34 +0000 (19:45 -0800)]
Merge pull request #1786 from xianjiec/dev
fix bugs by adding const
Xianjie Chen [Fri, 23 Jan 2015 03:27:24 +0000 (19:27 -0800)]
fix bugs by adding const
Evan Shelhamer [Thu, 22 Jan 2015 08:29:35 +0000 (00:29 -0800)]
Merge pull request #1473 from longjon/pytest
Python testing
Jeff Donahue [Sun, 2 Nov 2014 09:56:19 +0000 (01:56 -0800)]
hdf5_save_nd_dataset takes a const string& (instead of const string)
Jeff Donahue [Sun, 2 Nov 2014 04:37:05 +0000 (21:37 -0700)]
SoftmaxWithLossLayer: use CreateLayer so that a CuDNNSoftmaxLayer
is created if available
Evan Shelhamer [Wed, 21 Jan 2015 23:43:49 +0000 (15:43 -0800)]
Back-merge fixes + docs
and other fixes and documentation updates.
Jeff Donahue [Sun, 2 Nov 2014 04:22:11 +0000 (21:22 -0700)]
Unroll kernels in SoftmaxLayer...from terrible performance to mediocre
performance.
Jon Long [Tue, 20 Jan 2015 00:59:48 +0000 (16:59 -0800)]
Merge pull request #1756 from jeffdonahue/max-total-bytes-limit
Max out the protobuf file read size limit
Jeff Donahue [Thu, 16 Oct 2014 20:24:39 +0000 (13:24 -0700)]
SetTotalBytesLimit to the max (2 GB minus 1 byte)
Jonathan L Long [Fri, 16 Jan 2015 21:25:00 +0000 (13:25 -0800)]
gut dataset wrappers
Sergio Guadarrama [Fri, 16 Jan 2015 21:20:26 +0000 (13:20 -0800)]
test db wrappers
Jonathan L Long [Fri, 16 Jan 2015 21:20:15 +0000 (13:20 -0800)]
use db wrappers
Sergio Guadarrama [Fri, 16 Jan 2015 21:18:50 +0000 (13:18 -0800)]
add db wrappers
Jon Long [Mon, 19 Jan 2015 04:29:47 +0000 (20:29 -0800)]
Merge pull request #1747 from yosinski/doc-up
Updated doc to suggest boost 1.57
Jason Yosinski [Sun, 18 Jan 2015 04:07:36 +0000 (23:07 -0500)]
Updated doc to suggest boost 1.57