Jonathan L Long [Tue, 6 May 2014 23:40:09 +0000 (16:40 -0700)]
improve includes in util/math_functions.hpp
This commit removes the redundant <math.h>, and adds the necessary
<stdint.h> and <glog/logging.h>.
Jeff Donahue [Tue, 6 May 2014 18:59:25 +0000 (11:59 -0700)]
note bug in cifar10 full with CPU computation
Evan Shelhamer [Fri, 2 May 2014 21:11:31 +0000 (14:11 -0700)]
Merge pull request #294 from longjon/memory-data-layer
Add a layer for in-memory data, and expose it to Python
Jonathan L Long [Fri, 25 Apr 2014 21:56:44 +0000 (14:56 -0700)]
fix lint error in syncedmem.hpp
Jonathan L Long [Thu, 17 Apr 2014 10:06:20 +0000 (03:06 -0700)]
pycaffe: allow 1d labels to be passed to set_input_arrays
Jonathan L Long [Fri, 25 Apr 2014 21:33:25 +0000 (14:33 -0700)]
pycaffe: add Net.set_input_arrays for input from numpy
This requires a net whose first layer is a MemoryDataLayer.
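A minimal usage sketch (not from the repo; the prototxt path, shapes, and batch size are assumptions for illustration, and the wrapper is assumed importable as `caffe`):

```python
import numpy as np
import caffe

# Hypothetical net definition whose first layer is a MemoryDataLayer;
# its batch size must evenly divide the number of rows passed in.
net = caffe.Net('examples/mnist/lenet_memory.prototxt')

# Data must be a contiguous float32 array of shape (n, channels, height, width);
# labels may now be passed as a 1d float32 array of length n.
data = np.ascontiguousarray(np.random.rand(64, 1, 28, 28), dtype=np.float32)
labels = np.ascontiguousarray(np.random.randint(0, 10, size=64), dtype=np.float32)

net.set_input_arrays(data, labels)
# Keep data and labels alive for as long as the net may run forward:
# the MemoryDataLayer reads these arrays in place rather than copying them.
```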
Jonathan L Long [Fri, 25 Apr 2014 21:29:50 +0000 (14:29 -0700)]
pycaffe: store a shared_ptr<CaffeNet> in SGDSolver
Doing this, rather than constructing the CaffeNet wrapper every time,
will allow the wrapper to hold references that last at least as long as
SGDSolver (which will be necessary to ensure that data used by
MemoryDataLayer doesn't get freed).
Jonathan L Long [Fri, 25 Apr 2014 21:09:02 +0000 (14:09 -0700)]
pycaffe: let boost pass shared_ptr<CaffeNet>
Jonathan L Long [Thu, 17 Apr 2014 09:24:02 +0000 (02:24 -0700)]
add basic tests for MemoryDataLayer
Jonathan L Long [Fri, 4 Apr 2014 22:20:02 +0000 (15:20 -0700)]
add size accessors to MemoryDataLayer
This will facilitate input size checking for pycaffe (and potentially
others).
Jonathan L Long [Fri, 4 Apr 2014 02:27:21 +0000 (19:27 -0700)]
add MemoryDataLayer for reading input from contiguous blocks of memory
Jonathan L Long [Fri, 4 Apr 2014 02:19:47 +0000 (19:19 -0700)]
add set_cpu_data to Blob and SyncedMemory
This allows a blob to be updated, without a copy, to use already existing
memory (and will support MemoryDataLayer).
Jeff Donahue [Wed, 30 Apr 2014 19:53:32 +0000 (12:53 -0700)]
Merge pull request #378 from longjon/no-merge-duplicates
Note the last added layer/params in caffe.proto to prevent conflicts
Jonathan L Long [Wed, 30 Apr 2014 19:28:55 +0000 (12:28 -0700)]
note the last added layer/params in caffe.proto to prevent conflicts
The current scheme does not actually prevent conflicts, since three-way
merge will accept simultaneous changes that agree on the next number.
This commit fixes this by explicitly noting the last layer added.
Evan Shelhamer [Wed, 30 Apr 2014 18:37:56 +0000 (11:37 -0700)]
Merge pull request #377 from sguada/fix_initial_test
Keep same format for all net testing log messages
Sergio Guadarrama [Wed, 30 Apr 2014 18:26:55 +0000 (11:26 -0700)]
Keep uniform test messages for all the tests
Jeff Donahue [Tue, 29 Apr 2014 04:50:00 +0000 (21:50 -0700)]
rollback 8368818, does not build
Sergey Karayev [Tue, 29 Apr 2014 02:47:17 +0000 (19:47 -0700)]
Add Sublime Text project settings to gitignore
Sergey Karayev [Tue, 29 Apr 2014 02:44:32 +0000 (19:44 -0700)]
Handling CUBLAS_STATUS_NOT_SUPPORTED to suppress warning
Jeff Donahue [Sat, 26 Apr 2014 23:46:39 +0000 (16:46 -0700)]
Merge pull request #370 from shelhamer/test-net-polish
Polish test_net
Evan Shelhamer [Sat, 26 Apr 2014 23:33:40 +0000 (16:33 -0700)]
fix test_net to upgrade params from v0 if needed
Evan Shelhamer [Sat, 26 Apr 2014 23:08:22 +0000 (16:08 -0700)]
default test net device to 0 and log device chosen
Jeff Donahue [Sat, 26 Apr 2014 22:47:35 +0000 (15:47 -0700)]
set seed in neuron and power layer tests for deterministic results
Jeff Donahue [Sat, 26 Apr 2014 22:45:11 +0000 (15:45 -0700)]
Merge pull request #367 from shelhamer/randomize-test-order
Randomize order of test execution by `make runtest`
Evan Shelhamer [Sat, 26 Apr 2014 22:23:56 +0000 (15:23 -0700)]
proofreading and trivial polish
Evan Shelhamer [Sat, 26 Apr 2014 22:22:42 +0000 (15:22 -0700)]
add device id arg to test_net (fix #232)
Jeff Donahue [Sat, 26 Apr 2014 22:13:15 +0000 (15:13 -0700)]
Merge pull request #303 from longjon/hinge-loss-layer
HingeLossLayer
Jeff Donahue [Sat, 26 Apr 2014 22:09:17 +0000 (15:09 -0700)]
Merge pull request #368 from shelhamer/fix-data-layer-sequence-tests
Fix data layer sequence tests to avoid leveldb contention and lock failures
Evan Shelhamer [Sat, 26 Apr 2014 21:57:11 +0000 (14:57 -0700)]
scope data layer sequence tests to avoid leveldb contention
Evan Shelhamer [Sat, 26 Apr 2014 21:42:38 +0000 (14:42 -0700)]
set phase in dropout tests
Evan Shelhamer [Sat, 26 Apr 2014 01:54:12 +0000 (18:54 -0700)]
randomize order of test execution by make runtest
Jeff Donahue [Sat, 26 Apr 2014 01:42:37 +0000 (18:42 -0700)]
Merge pull request #366 from shelhamer/test-phase-fix
set TRAIN in CommonTest.TestPhase
Evan Shelhamer [Sat, 26 Apr 2014 01:33:40 +0000 (18:33 -0700)]
set TRAIN in CommonTest.TestPhase
...to avoid effects of test execution order. The original intention was
to check that TRAIN is the default, but it is better to have consistent
results.
Jonathan L Long [Fri, 25 Apr 2014 23:44:05 +0000 (16:44 -0700)]
test HingeLossLayer
Based on SoftmaxWithLossLayerTest.
Jonathan L Long [Fri, 25 Apr 2014 23:38:48 +0000 (16:38 -0700)]
make gradient checker's kink use feature absolute value
In theory, layer functions could be nonsmooth anywhere; in all cases in
use so far, they are nonsmooth either at zero or at +1 and -1. In the
future, it might be necessary to generalize the kink mechanism beyond
this stopgap measure.
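As a rough illustration of the mechanism (a hedged NumPy sketch with made-up names such as `check_gradient`, `kink`, and `kink_range`; not the C++ GradientChecker itself), the numerical check skips any element whose absolute value falls within `kink_range` of the kink:

```python
import numpy as np

def check_gradient(f, dfdx, x, kink=1.0, kink_range=0.01, step=1e-2, tol=1e-4):
    """Compare the analytic gradient dfdx(x) against centered finite
    differences, skipping elements whose |x| is too close to the kink."""
    analytic = dfdx(x)
    for i in np.ndindex(x.shape):
        # Measure proximity to the kink on the absolute value of the feature.
        if abs(abs(x[i]) - kink) <= kink_range:
            continue
        x[i] += step
        f_plus = f(x)
        x[i] -= 2 * step
        f_minus = f(x)
        x[i] += step  # restore the original value
        numeric = (f_plus - f_minus) / (2 * step)
        assert abs(numeric - analytic[i]) <= tol, (i, numeric, analytic[i])

# Example: f(x) = sum_i max(0, 1 - x_i) is nonsmooth where x_i == 1.
f = lambda x: np.maximum(0.0, 1.0 - x).sum()
dfdx = lambda x: np.where(1.0 - x > 0.0, -1.0, 0.0)
check_gradient(f, dfdx, np.random.randn(20))
```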
Jonathan L Long [Tue, 8 Apr 2014 03:35:33 +0000 (20:35 -0700)]
add HingeLossLayer for one-vs-all hinge loss
This layer implements a "one-vs-all" hinge loss, (1/n) sum_ij max(0, 1 -
y_ij x_ij), with bottom blob x_ij (i ranging over examples and j over
classes), and y_ij = +1/-1 indicating the label. No regularization is
included, since regularization is done via weight decay or using the
parameters of another layer. The gradient is taken to be zero at the
hinge point. This commit only provides the CPU implementation.
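For concreteness, a hedged NumPy sketch of this loss and its (sub)gradient (function and variable names are illustrative, not the layer's API):

```python
import numpy as np

def hinge_loss(x, labels):
    """One-vs-all hinge loss (1/n) * sum_ij max(0, 1 - y_ij * x_ij),
    where y_ij is +1 for the true class of example i and -1 otherwise."""
    n = x.shape[0]
    y = -np.ones_like(x)
    y[np.arange(n), labels] = 1.0
    margins = np.maximum(0.0, 1.0 - y * x)
    loss = margins.sum() / n
    # Gradient w.r.t. x, taken to be zero exactly at the hinge point.
    dx = np.where(margins > 0.0, -y / n, 0.0)
    return loss, dx

loss, dx = hinge_loss(np.random.randn(4, 3), np.array([0, 2, 1, 2]))
```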
Evan Shelhamer [Fri, 25 Apr 2014 23:02:37 +0000 (16:02 -0700)]
Merge pull request #365 from longjon/pycaffe-empty-nets
Add unary CaffeNet constructor for uninitialized nets
Jonathan L Long [Fri, 25 Apr 2014 21:50:36 +0000 (14:50 -0700)]
pycaffe: add unary CaffeNet constructor for uninitialized nets
Note that parameters are uninitialized, not zero-filled.
Evan Shelhamer [Fri, 25 Apr 2014 19:51:22 +0000 (12:51 -0700)]
Merge pull request #363 from jeffdonahue/speedup-gradient-check
Gradient check speedups
Jeff Donahue [Fri, 25 Apr 2014 03:24:19 +0000 (20:24 -0700)]
eltwise gradient checker
Jeff Donahue [Fri, 25 Apr 2014 04:44:48 +0000 (21:44 -0700)]
move analytic gradient computation outside loop and store -- saves a lot
of time
Evan Shelhamer [Fri, 25 Apr 2014 16:23:22 +0000 (09:23 -0700)]
Merge pull request #364 from niuzhiheng/dev
Update drawnet.py to support new-style net definitions
Evan Shelhamer [Fri, 25 Apr 2014 16:23:00 +0000 (09:23 -0700)]
note pydot dependency of model visualization
ZhiHeng NIU [Fri, 25 Apr 2014 09:43:56 +0000 (17:43 +0800)]
Update drawnet.py to reflect the recently revised net definition.
Evan Shelhamer [Thu, 24 Apr 2014 16:37:43 +0000 (09:37 -0700)]
Merge pull request #352 from jeffdonahue/solver-seed
Eliminate use of "rand" in code; enable deterministic results via solver seed
Evan Shelhamer [Thu, 24 Apr 2014 16:35:13 +0000 (09:35 -0700)]
Merge pull request #357 from jeffdonahue/test-on-iter-0
Test before doing any training
Jeff Donahue [Thu, 24 Apr 2014 09:05:32 +0000 (02:05 -0700)]
test on "0th iteration" -- before doing any training
Jeff Donahue [Wed, 23 Apr 2014 01:00:22 +0000 (18:00 -0700)]
add caffe/random_fn lint rule to check for use of rand, rand_r, random
Jeff Donahue [Tue, 22 Apr 2014 23:32:40 +0000 (16:32 -0700)]
replace std::shuffle with version using prefetch rng; improve unit test
Jeff Donahue [Tue, 22 Apr 2014 21:12:36 +0000 (14:12 -0700)]
replace remaining uses of rand() with caffe_rng_rand()
Jeff Donahue [Tue, 22 Apr 2014 21:09:37 +0000 (14:09 -0700)]
prefetch_rng in window_data_layer
Jeff Donahue [Tue, 22 Apr 2014 20:56:02 +0000 (13:56 -0700)]
prefetch_rng in ImageDataLayer
Jeff Donahue [Tue, 22 Apr 2014 20:17:05 +0000 (13:17 -0700)]
test scale param
Jeff Donahue [Tue, 22 Apr 2014 20:08:26 +0000 (13:08 -0700)]
make seed test pass by setting up a new layer to generate the second sequence
Jeff Donahue [Tue, 22 Apr 2014 19:41:02 +0000 (12:41 -0700)]
clean up data_layer, add a prefetch_rng_ field to it and use it instead of
rand -- seeded tests still fail
Jeff Donahue [Tue, 22 Apr 2014 18:34:40 +0000 (11:34 -0700)]
add tests for random crop sequence -- currently both fail
Jeff Donahue [Tue, 22 Apr 2014 11:42:47 +0000 (04:42 -0700)]
add data layer crop tests
Jeff Donahue [Thu, 17 Apr 2014 04:00:30 +0000 (21:00 -0700)]
add random_seed field to SolverParameter and have solver use it --
already works for lenet, doesn't work for imagenet w/ rand() calls
Jeff Donahue [Wed, 23 Apr 2014 06:12:14 +0000 (23:12 -0700)]
add forward tests (via reference impl) for SigmoidCrossEntropyLayer
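A reference implementation for such a forward test can be written straight from the loss definition; a hedged NumPy sketch (names are illustrative, and the naive log form assumes inputs of moderate magnitude):

```python
import numpy as np

def sigmoid_cross_entropy_loss(x, targets):
    """Cross-entropy between targets in [0, 1] and sigmoid(x), summed over
    all elements and normalized by the batch size (first axis)."""
    n = x.shape[0]
    p = 1.0 / (1.0 + np.exp(-x))
    return -(targets * np.log(p) + (1.0 - targets) * np.log(1.0 - p)).sum() / n

x = np.random.randn(8, 5)
targets = (np.random.rand(8, 5) > 0.5).astype(np.float64)
print(sigmoid_cross_entropy_loss(x, targets))
```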
Evan Shelhamer [Tue, 22 Apr 2014 22:49:20 +0000 (15:49 -0700)]
Merge pull request #350 from longjon/finicky-exhaustive
Make CheckGradientExhaustive fail for topless layers
Jonathan L Long [Tue, 22 Apr 2014 06:41:50 +0000 (23:41 -0700)]
make CheckGradientExhaustive fail for topless layers
Without this commit, it is possible to mistakenly call
CheckGradientExhaustive on a layer with no top blobs (i.e., a loss
layer), causing the gradient check to silently succeed while doing
nothing. With this commit, doing so will cause the test to fail.
Evan Shelhamer [Tue, 22 Apr 2014 05:20:36 +0000 (22:20 -0700)]
Merge pull request #348 from jeffdonahue/datalayer-singleton-bugfix
Fix singleton call in data layers
Note that previous crash reports were not in fact due to the RNG, although they led to improvements in the RNG code.
Jeff Donahue [Tue, 22 Apr 2014 03:43:39 +0000 (20:43 -0700)]
do the same as prev commit for ImageDataLayer
Jeff Donahue [Tue, 22 Apr 2014 03:35:43 +0000 (20:35 -0700)]
fix bug where DataLayerPrefetch creates its own Caffe singleton, causing
the phase to always be set to TRAIN (always random crops) and RNG failures
Evan Shelhamer [Mon, 21 Apr 2014 03:12:57 +0000 (20:12 -0700)]
Comment current forward/backward responsibilities
Evan Shelhamer [Mon, 21 Apr 2014 03:07:06 +0000 (20:07 -0700)]
Proofread install docs
Evan Shelhamer [Sun, 20 Apr 2014 23:57:56 +0000 (16:57 -0700)]
Merge pull request #339 from sergeyk/dev
Update installation docs
Sergey Karayev [Sun, 20 Apr 2014 20:38:30 +0000 (13:38 -0700)]
installation doc update
Evan Shelhamer [Sat, 19 Apr 2014 06:18:59 +0000 (23:18 -0700)]
Merge pull request #336 from jeffdonahue/fix-rng-segfault
Fix RNG segfault related to #297
Evan Shelhamer [Fri, 18 Apr 2014 18:51:41 +0000 (11:51 -0700)]
fix examples path in mnist leveldb sh
Jeff Donahue [Fri, 18 Apr 2014 18:11:01 +0000 (11:11 -0700)]
Merge pull request #332 from jeffdonahue/share-trained-layers
add ShareTrainedLayersWith method and use for test net in solver
Jeff Donahue [Fri, 18 Apr 2014 17:51:30 +0000 (10:51 -0700)]
remove now unused set_generator and related code
Jeff Donahue [Fri, 18 Apr 2014 17:36:28 +0000 (10:36 -0700)]
pass caffe rng ref into variate_generator constructor instead of having
caffe rng adopt its state
Jeff Donahue [Fri, 18 Apr 2014 01:23:33 +0000 (18:23 -0700)]
remove unnecessary return from void set_generator
Jeff Donahue [Wed, 16 Apr 2014 17:01:30 +0000 (10:01 -0700)]
add ShareTrainedLayersWith method and use for test net in solver
Evan Shelhamer [Wed, 16 Apr 2014 15:46:49 +0000 (09:46 -0600)]
Merge pull request #330 from jeffdonahue/mnist-autoencoder-example
MNIST autoencoder example
Jeff Donahue [Wed, 16 Apr 2014 15:14:27 +0000 (08:14 -0700)]
change to correct next layer id for merge
Jeff Donahue [Wed, 16 Apr 2014 15:13:20 +0000 (08:13 -0700)]
clear sigmoid top vec at initialization
Jeff Donahue [Wed, 16 Apr 2014 05:46:45 +0000 (22:46 -0700)]
add sigmoid cross ent layer unit tests
Jeff Donahue [Wed, 16 Apr 2014 05:26:11 +0000 (22:26 -0700)]
mnist_autoencoder_solver cleanup
Jeff Donahue [Wed, 16 Apr 2014 05:25:46 +0000 (22:25 -0700)]
change lenet dir to 'mnist' in docs
Jeff Donahue [Tue, 15 Apr 2014 22:38:57 +0000 (15:38 -0700)]
make solver able to compute and display test loss
Jeff Donahue [Tue, 15 Apr 2014 22:38:06 +0000 (15:38 -0700)]
mnist autoencoder test proto bugfix: add sigmoid layer before loss
Jeff Donahue [Tue, 15 Apr 2014 22:12:57 +0000 (15:12 -0700)]
enable DataLayer to output unlabeled data
Jeff Donahue [Tue, 15 Apr 2014 21:52:41 +0000 (14:52 -0700)]
add mnist autoencoder example necessities (sigmoid cross entropy loss
layer, sparse gaussian filler)
Jeff Donahue [Tue, 15 Apr 2014 17:56:32 +0000 (10:56 -0700)]
rename lenet dir to mnist
Evan Shelhamer [Tue, 15 Apr 2014 07:37:06 +0000 (01:37 -0600)]
Give choice of ATLAS, MKL, and OpenBLAS (with option to override paths)
- configure build for ATLAS, MKL, or OpenBLAS on Linux and OSX
- allow overriding of the include or lib dirs
- replace magic numbers with BLAS names (atlas, mkl, open)
Follow-up from #305 and #325.
Evan Shelhamer [Tue, 15 Apr 2014 07:36:18 +0000 (01:36 -0600)]
rename python include config var to match lib
Evan Shelhamer [Tue, 15 Apr 2014 07:47:57 +0000 (01:47 -0600)]
Merge pull request #325 from AlOa/OpenBlas
Add the possibility to use OpenBLAS
AlOa [Mon, 14 Apr 2014 14:12:12 +0000 (16:12 +0200)]
Add the possibility to use OpenBLAS
Evan Shelhamer [Sun, 13 Apr 2014 01:40:08 +0000 (18:40 -0700)]
Merge pull request #318 from jeffdonahue/blob-copy-by-reference
add Share{Data,Diff} methods to blobs to enable "virtual" copies
- reduce operations by not copying values
- save memory by sharing data through shared_ptrs
- simplify SplitLayer logic
Evan Shelhamer [Sun, 13 Apr 2014 01:26:30 +0000 (18:26 -0700)]
Merge pull request #319 from jeffdonahue/sigmoid-optimization
sigmoid layer backward pass optimization: don't recompute forward pass
Jeff Donahue [Sat, 12 Apr 2014 11:16:23 +0000 (04:16 -0700)]
sigmoid layer backward pass optimization: don't recompute forward pass
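The saving comes from writing the sigmoid derivative in terms of the output already stored by the forward pass; a hedged NumPy sketch of the idea (illustrative names only):

```python
import numpy as np

def sigmoid_forward(x):
    # Forward pass; the output y is kept in the top blob.
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_backward(top_diff, y):
    # Backward pass reuses the stored output: dL/dx = dL/dy * y * (1 - y),
    # so there is no second exp() call to recompute sigmoid(x).
    return top_diff * y * (1.0 - y)

x = np.random.randn(4, 3)
y = sigmoid_forward(x)
dx = sigmoid_backward(np.ones_like(y), y)
```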
Jeff Donahue [Sat, 12 Apr 2014 07:52:15 +0000 (00:52 -0700)]
change Adopt -> Share as suggested by kloudkl
Jeff Donahue [Sat, 12 Apr 2014 01:07:08 +0000 (18:07 -0700)]
add Adopt{Data,Diff} methods to blobs to enable "virtual copying"
Jeff Donahue [Fri, 11 Apr 2014 23:07:43 +0000 (16:07 -0700)]
add unit tests for cpu/gpu copy functions
Jeff Donahue [Sat, 12 Apr 2014 06:47:40 +0000 (23:47 -0700)]
change some unnecessary TYPED_TESTs to TEST_Fs
Jeff Donahue [Fri, 11 Apr 2014 23:10:15 +0000 (16:10 -0700)]
fix lint errors by adding 'explicit' to new single arg pycaffe
constructors
Evan Shelhamer [Fri, 11 Apr 2014 05:49:50 +0000 (22:49 -0700)]
polished ignore
Evan Shelhamer [Thu, 10 Apr 2014 00:11:49 +0000 (17:11 -0700)]
Back-merge docs and example image changes from `master` to `dev`