platform/upstream/caffeonacl.git
Sergey Karayev [Mon, 19 May 2014 18:11:37 +0000 (11:11 -0700)]
Incorporated Evan’s comments for neuron layers

Sergey Karayev [Mon, 19 May 2014 17:44:21 +0000 (10:44 -0700)]
Cosmetic change in ConcatLayer

Sergey Karayev [Mon, 19 May 2014 17:43:21 +0000 (10:43 -0700)]
Lil’ more docstring, and cosmetic change in EuclideanLossLayer

Sergey Karayev [Tue, 29 Apr 2014 07:21:15 +0000 (00:21 -0700)]
fwd/back math docs for neuron layers

Sergey Karayev [Tue, 29 Apr 2014 02:40:43 +0000 (19:40 -0700)]
Cosmetic change in prep for data layer work

Sergey Karayev [Tue, 29 Apr 2014 02:39:36 +0000 (19:39 -0700)]
Split all loss layers into own .cpp files

Sergey Karayev [Tue, 29 Apr 2014 02:06:07 +0000 (19:06 -0700)]
layer definition reorganization and documentation
- split out neuron, loss, and data layers into own header files
- added LossLayer class with common SetUp checks
- in-progress concise documentation of each layer's purpose

Jeff Donahue [Wed, 14 May 2014 20:05:17 +0000 (13:05 -0700)]
Merge pull request #417 from shelhamer/create-and-write-proto

Write/create/truncate prototxt when saving to fix #341

Evan Shelhamer [Wed, 14 May 2014 19:35:33 +0000 (12:35 -0700)]
fix workaround in net prototxt upgrade

Evan Shelhamer [Wed, 14 May 2014 19:28:55 +0000 (12:28 -0700)]
Write/create/truncate prototxt when saving to fix #341

WriteProtoToTextFile now saves the prototxt whether or not the file
already exists and sets the permissions to owner read/write and group +
other read.

Thanks @beam2d and @chyh1990 for pointing out the open modes bug.
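
A minimal sketch of the fix described above, assuming POSIX open() plus protobuf's TextFormat and FileOutputStream; the surrounding function is an illustration of the approach, not the verbatim Caffe code:

    #include <fcntl.h>
    #include <unistd.h>
    #include <glog/logging.h>
    #include <google/protobuf/io/zero_copy_stream_impl.h>
    #include <google/protobuf/text_format.h>

    using google::protobuf::Message;
    using google::protobuf::TextFormat;
    using google::protobuf::io::FileOutputStream;

    void WriteProtoToTextFile(const Message& proto, const char* filename) {
      // Create the file if missing, truncate it if present; mode 0644 gives
      // the owner read/write and group + other read, as described above.
      int fd = open(filename, O_WRONLY | O_CREAT | O_TRUNC, 0644);
      CHECK_NE(fd, -1) << "File cannot be created: " << filename;
      FileOutputStream* output = new FileOutputStream(fd);
      CHECK(TextFormat::Print(proto, output));
      delete output;
      close(fd);
    }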

Evan Shelhamer [Wed, 14 May 2014 01:10:22 +0000 (18:10 -0700)]
Merge pull request #414 from shelhamer/net-output-blobs

Make net know output blob indices

Evan Shelhamer [Wed, 14 May 2014 01:08:06 +0000 (18:08 -0700)]
net knows output blobs

Evan Shelhamer [Tue, 13 May 2014 20:16:19 +0000 (13:16 -0700)]
Merge pull request #413 from shelhamer/cublas-status-not-supported

add cublas status in cuda 6 to fix warning

Evan Shelhamer [Tue, 13 May 2014 19:30:34 +0000 (12:30 -0700)]
add cublas status in cuda 6 to fix warning

...and #define for older CUDAs to not break the build.
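
One plausible shape for this change (a sketch only: the helper name and exact version cutoff are assumptions based on the commit text, and the #define fallback mentioned above is an alternative to the version guard shown here):

    #include <cuda.h>       // defines CUDA_VERSION
    #include <cublas_v2.h>  // cublasStatus_t and the status codes

    // Name the status code added in CUDA 6 while keeping older toolkits,
    // which do not define CUBLAS_STATUS_NOT_SUPPORTED, compiling.
    const char* cublasGetErrorString(cublasStatus_t error) {
      switch (error) {
        case CUBLAS_STATUS_SUCCESS:
          return "CUBLAS_STATUS_SUCCESS";
        // ... the other pre-CUDA-6 status codes elided ...
    #if CUDA_VERSION >= 6000
        case CUBLAS_STATUS_NOT_SUPPORTED:
          return "CUBLAS_STATUS_NOT_SUPPORTED";
    #endif
        default:
          break;
      }
      return "Unknown cublas status";
    }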

Jeff Donahue [Sun, 11 May 2014 00:41:39 +0000 (17:41 -0700)]
Merge pull request #406 from jeffdonahue/makefile-include-bug

Fix Makefile header dependency bug

Jeff Donahue [Sun, 11 May 2014 00:20:19 +0000 (17:20 -0700)]
fix Makefile bug - HXX_SRCS was things that don't end in .hpp, instead
of things that do...

Evan Shelhamer [Fri, 9 May 2014 21:56:13 +0000 (14:56 -0700)]
Merge pull request #403 from jeffdonahue/solver-mode-enum

Make solver_mode an enum: CPU or GPU

Jeff Donahue [Fri, 9 May 2014 21:50:05 +0000 (14:50 -0700)]
make solver_mode an enum with CPU and GPU -- fully backwards compatible
with old 0/1 style
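
The compatibility comes from keeping the numeric values: with CPU pinned to 0 and GPU to 1, a solver prototxt that still says solver_mode: 1 selects the same setting as solver_mode: GPU, since protobuf accepts an enum field's integer value as well as its name. The generated C++ looks roughly like the following (protoc's naming convention, shown only as an illustration):

    // Roughly what protoc emits for "enum SolverMode { CPU = 0; GPU = 1; }"
    // nested inside SolverParameter; old 0/1-style configs map onto the same
    // values, so existing solver prototxts keep working.
    enum SolverParameter_SolverMode {
      SolverParameter_SolverMode_CPU = 0,
      SolverParameter_SolverMode_GPU = 1
    };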

Jeff Donahue [Wed, 7 May 2014 04:34:50 +0000 (21:34 -0700)]
Merge pull request #396 from longjon/math-includes

Improve includes in util/math_functions.hpp

Jonathan L Long [Tue, 6 May 2014 23:40:09 +0000 (16:40 -0700)]
improve includes in util/math_functions.hpp

This commit removes the redundant <math.h>, and adds the necessary
<stdint.h> and <glog/logging.h>.

Jeff Donahue [Tue, 6 May 2014 18:59:25 +0000 (11:59 -0700)]
note bug in cifar10 full with CPU computation

Evan Shelhamer [Fri, 2 May 2014 21:11:31 +0000 (14:11 -0700)]
Merge pull request #294 from longjon/memory-data-layer

Add a layer for in-memory data, and expose it to Python

Jonathan L Long [Fri, 25 Apr 2014 21:56:44 +0000 (14:56 -0700)]
fix lint error in syncedmem.hpp

Jonathan L Long [Thu, 17 Apr 2014 10:06:20 +0000 (03:06 -0700)]
pycaffe: allow 1d labels to be passed to set_input_arrays

Jonathan L Long [Fri, 25 Apr 2014 21:33:25 +0000 (14:33 -0700)]
pycaffe: add Net.set_input_arrays for input from numpy

This requires a net whose first layer is a MemoryDataLayer.
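
On the wrapper's C++ side, the natural way to enforce that requirement is a dynamic cast on layer 0 before handing the arrays over; a sketch under that assumption (method and member names here are illustrative, not necessarily the actual pycaffe code):

    #include <stdexcept>
    #include <boost/shared_ptr.hpp>
    #include "caffe/caffe.hpp"

    // Only a MemoryDataLayer can accept caller-owned arrays, so check the
    // first layer's type and then pass the (contiguous float) data along.
    void CaffeNet::set_input_arrays(float* data, float* labels, int num) {
      boost::shared_ptr<caffe::MemoryDataLayer<float> > md_layer =
          boost::dynamic_pointer_cast<caffe::MemoryDataLayer<float> >(
              net_->layers()[0]);
      if (!md_layer) {
        throw std::runtime_error("set_input_arrays may only be called if the "
                                 "first layer is a MemoryDataLayer");
      }
      md_layer->Reset(data, labels, num);
    }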

Jonathan L Long [Fri, 25 Apr 2014 21:29:50 +0000 (14:29 -0700)]
pycaffe: store a shared_ptr<CaffeNet> in SGDSolver

Doing this, rather than constructing the CaffeNet wrapper every time,
will allow the wrapper to hold references that last at least as long as
SGDSolver (which will be necessary to ensure that data used by
MemoryDataLayer doesn't get freed).

Jonathan L Long [Fri, 25 Apr 2014 21:09:02 +0000 (14:09 -0700)]
pycaffe: let boost pass shared_ptr<CaffeNet>

Jonathan L Long [Thu, 17 Apr 2014 09:24:02 +0000 (02:24 -0700)]
add basic tests for MemoryDataLayer

Jonathan L Long [Fri, 4 Apr 2014 22:20:02 +0000 (15:20 -0700)]
add size accessors to MemoryDataLayer

This will facilitate input size checking for pycaffe (and potentially
others).
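
Presumably these are simple getters over the shape configured in the layer's parameters, something along these lines (names assumed):

    // Inside MemoryDataLayer<Dtype>, whose int members hold the configured
    // shape: expose it so a caller such as pycaffe's set_input_arrays can
    // validate array dimensions before calling Reset().
    int batch_size() { return batch_size_; }
    int channels()   { return channels_; }
    int height()     { return height_; }
    int width()      { return width_; }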

Jonathan L Long [Fri, 4 Apr 2014 02:27:21 +0000 (19:27 -0700)]
add MemoryDataLayer for reading input from contiguous blocks of memory

Jonathan L Long [Fri, 4 Apr 2014 02:19:47 +0000 (19:19 -0700)]
add set_cpu_data to Blob and SyncedMemory

This allows a blob to be updated without copy to use already existing
memory (and will support MemoryDataLayer).
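
A sketch of what set_cpu_data has to do at the SyncedMemory level (member names are assumptions, not the verbatim implementation):

    // Adopt caller-owned memory without copying, so a MemoryDataLayer can
    // point its blobs directly at user-provided arrays.
    void SyncedMemory::set_cpu_data(void* data) {
      CHECK(data);
      if (own_cpu_data_) {
        CaffeFreeHost(cpu_ptr_);   // release any buffer we allocated ourselves
      }
      cpu_ptr_ = data;             // borrow the caller's pointer
      head_ = HEAD_AT_CPU;         // the CPU copy is now the up-to-date one
      own_cpu_data_ = false;       // never free memory we do not own
    }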

Jeff Donahue [Wed, 30 Apr 2014 19:53:32 +0000 (12:53 -0700)]
Merge pull request #378 from longjon/no-merge-duplicates

Note the last added layer/params in caffe.proto to prevent conflicts

Jonathan L Long [Wed, 30 Apr 2014 19:28:55 +0000 (12:28 -0700)]
note the last added layer/params in caffe.proto to prevent conflicts

The current scheme does not actually prevent conflicts, since three-way
merge will accept simultaneous changes that agree on the next number.
This commit fixes this by explicitly noting the last layer added.

Evan Shelhamer [Wed, 30 Apr 2014 18:37:56 +0000 (11:37 -0700)]
Merge pull request #377 from sguada/fix_initial_test

Keep same format for all net testing log messages

Sergio Guadarrama [Wed, 30 Apr 2014 18:26:55 +0000 (11:26 -0700)]
Keep uniform test messages for all the tests

Jeff Donahue [Tue, 29 Apr 2014 04:50:00 +0000 (21:50 -0700)]
rollback 8368818, does not build

Sergey Karayev [Tue, 29 Apr 2014 02:47:17 +0000 (19:47 -0700)]
Add Sublime Text project settings to gitignore

Sergey Karayev [Tue, 29 Apr 2014 02:44:32 +0000 (19:44 -0700)]
Handling CUBLAS_STATUS_NOT_SUPPORTED to suppress warning

Jeff Donahue [Sat, 26 Apr 2014 23:46:39 +0000 (16:46 -0700)]
Merge pull request #370 from shelhamer/test-net-polish

Polish test_net

Evan Shelhamer [Sat, 26 Apr 2014 23:33:40 +0000 (16:33 -0700)]
fix test_net to upgrade params from v0 if needed

Evan Shelhamer [Sat, 26 Apr 2014 23:08:22 +0000 (16:08 -0700)]
default test net device to 0 and log device chosen

Jeff Donahue [Sat, 26 Apr 2014 22:47:35 +0000 (15:47 -0700)]
set seed in neuron and power layer tests for deterministic results

Jeff Donahue [Sat, 26 Apr 2014 22:45:11 +0000 (15:45 -0700)]
Merge pull request #367 from shelhamer/randomize-test-order

Randomize order of test execution by `make runtest`

Evan Shelhamer [Sat, 26 Apr 2014 22:23:56 +0000 (15:23 -0700)]
proofreading and trivial polish

Evan Shelhamer [Sat, 26 Apr 2014 22:22:42 +0000 (15:22 -0700)]
add device id arg to test_net (fix #232)

Jeff Donahue [Sat, 26 Apr 2014 22:13:15 +0000 (15:13 -0700)]
Merge pull request #303 from longjon/hinge-loss-layer

HingeLossLayer

Jeff Donahue [Sat, 26 Apr 2014 22:09:17 +0000 (15:09 -0700)]
Merge pull request #368 from shelhamer/fix-data-layer-sequence-tests

Fix data layer sequence tests to avoid leveldb contention and lock failures

Evan Shelhamer [Sat, 26 Apr 2014 21:57:11 +0000 (14:57 -0700)]
scope data layer sequence tests to avoid leveldb contention

Evan Shelhamer [Sat, 26 Apr 2014 21:42:38 +0000 (14:42 -0700)]
set phase in dropout tests

Evan Shelhamer [Sat, 26 Apr 2014 01:54:12 +0000 (18:54 -0700)]
randomize order of test execution by make runtest

Jeff Donahue [Sat, 26 Apr 2014 01:42:37 +0000 (18:42 -0700)]
Merge pull request #366 from shelhamer/test-phase-fix

set TRAIN in CommonTest.TestPhase

Evan Shelhamer [Sat, 26 Apr 2014 01:33:40 +0000 (18:33 -0700)]
set TRAIN in CommonTest.TestPhase

...to avoid effects of test execution order. The original intention was
to check that TRAIN is the default, but it is better to have consistent
results.

Jonathan L Long [Fri, 25 Apr 2014 23:44:05 +0000 (16:44 -0700)]
test HingeLossLayer

Based on SoftmaxWithLossLayerTest.

Jonathan L Long [Fri, 25 Apr 2014 23:38:48 +0000 (16:38 -0700)]
make gradient checker's kink use feature absolute value

In theory, layer functions could be nonsmooth anywhere; in all cases in
use so far, they are nonsmooth at either zero or +1 and -1. In the
future, it might be necessary to generalize the kink mechanism beyond
this stopgap measure.
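
Concretely, with the kink at k (1 for the hinge loss, 0 for ReLU-style nonsmoothness) and a tolerance band of width r (kink_range) around it, the checker compares the numeric estimate at feature value x against the analytic gradient only when

    \bigl|\,|x| - k\,\bigr| > r,

so a single kink value of 1 covers the nonsmooth points at both +1 and -1. (Notation mine, not the checker's variable names.)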

Jonathan L Long [Tue, 8 Apr 2014 03:35:33 +0000 (20:35 -0700)]
add HingeLossLayer for one-vs-all hinge loss

This layer implements a "one-vs-all" hinge loss, (1/n) sum_ij max(0, 1 -
y_ij x_ij), with bottom blob x_ij (i ranging over examples and j over
classes), and y_ij = +1/-1 indicating the label. No regularization is
included, since regularization is done via weight decay or using the
parameters of another layer. The gradient is taken to be zero at the
hinge point. This commit only provides the CPU implementation.
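
Written out in the commit's notation (n examples indexed by i, classes by j), the loss and the (sub)gradient the CPU backward pass produces are

    E = \frac{1}{n} \sum_{i} \sum_{j} \max\bigl(0,\; 1 - y_{ij} x_{ij}\bigr),
    \qquad
    \frac{\partial E}{\partial x_{ij}} = -\frac{1}{n}\, y_{ij}\, \mathbf{1}\bigl[\, y_{ij} x_{ij} < 1 \,\bigr],

where the strict inequality in the indicator is what makes the gradient zero at the hinge point y_ij x_ij = 1.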

Evan Shelhamer [Fri, 25 Apr 2014 23:02:37 +0000 (16:02 -0700)]
Merge pull request #365 from longjon/pycaffe-empty-nets

Add unary CaffeNet constructor for uninitialized nets

Jonathan L Long [Fri, 25 Apr 2014 21:50:36 +0000 (14:50 -0700)]
pycaffe: add unary CaffeNet constructor for uninitialized nets

Note that parameters are uninitialized, not zero-filled.

Evan Shelhamer [Fri, 25 Apr 2014 19:51:22 +0000 (12:51 -0700)]
Merge pull request #363 from jeffdonahue/speedup-gradient-check

Gradient check speedups

Jeff Donahue [Fri, 25 Apr 2014 03:24:19 +0000 (20:24 -0700)]
eltwise gradient checker

Jeff Donahue [Fri, 25 Apr 2014 04:44:48 +0000 (21:44 -0700)]
move analytic gradient computation outside loop and store -- saves a lot
of time
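
The saving is the usual loop-invariant hoist: the analytic gradient does not depend on which element is currently being perturbed, so it can be computed and stored once instead of being recomputed inside the finite-difference loop. A rough structural sketch (not the gradient checker's actual code):

    // One Forward/Backward pass caches the analytic gradient up front ...
    layer->Forward(bottom, &top);
    layer->Backward(top, true, &bottom);
    std::vector<float> analytic(blob->cpu_diff(),
                                blob->cpu_diff() + blob->count());

    // ... and the finite-difference loop then only needs Forward passes,
    // comparing each numeric estimate against the stored analytic[i].
    for (int i = 0; i < blob->count(); ++i) {
      // perturb element i by +/- stepsize, rerun Forward, form the centered
      // difference, and compare it to analytic[i]
    }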

Evan Shelhamer [Fri, 25 Apr 2014 16:23:22 +0000 (09:23 -0700)]
Merge pull request #364 from niuzhiheng/dev

Update drawnet.py to support new-style net definitions

Evan Shelhamer [Fri, 25 Apr 2014 16:23:00 +0000 (09:23 -0700)]
note pydot dependency of model visualization

ZhiHeng NIU [Fri, 25 Apr 2014 09:43:56 +0000 (17:43 +0800)]
Update drawnet.py to reflect the recently revised net definition.

Evan Shelhamer [Thu, 24 Apr 2014 16:37:43 +0000 (09:37 -0700)]
Merge pull request #352 from jeffdonahue/solver-seed

Eliminate use of "rand" in code; enable deterministic results via solver seed

Evan Shelhamer [Thu, 24 Apr 2014 16:35:13 +0000 (09:35 -0700)]
Merge pull request #357 from jeffdonahue/test-on-iter-0

Test before doing any training

Jeff Donahue [Thu, 24 Apr 2014 09:05:32 +0000 (02:05 -0700)]
test on "0th iteration" -- before doing any training

Jeff Donahue [Wed, 23 Apr 2014 01:00:22 +0000 (18:00 -0700)]
add caffe/random_fn lint rule to check for use of rand, rand_r, random

Jeff Donahue [Tue, 22 Apr 2014 23:32:40 +0000 (16:32 -0700)]
replace std::shuffle with version using prefetch rng; improve unit test

Jeff Donahue [Tue, 22 Apr 2014 21:12:36 +0000 (14:12 -0700)]
replace remaining uses of rand() with caffe_rng_rand()

Jeff Donahue [Tue, 22 Apr 2014 21:09:37 +0000 (14:09 -0700)]
prefetch_rng in window_data_layer

Jeff Donahue [Tue, 22 Apr 2014 20:56:02 +0000 (13:56 -0700)]
prefetch_rng in ImageDataLayer

Jeff Donahue [Tue, 22 Apr 2014 20:17:05 +0000 (13:17 -0700)]
test scale param

Jeff Donahue [Tue, 22 Apr 2014 20:08:26 +0000 (13:08 -0700)]
make seed test pass by setting up new layer to generate the 2nd sequence

Jeff Donahue [Tue, 22 Apr 2014 19:41:02 +0000 (12:41 -0700)]
cleanup data_layer, add prefetch_rng_ field to it and use instead of
rand -- seeded tests still fail

Jeff Donahue [Tue, 22 Apr 2014 18:34:40 +0000 (11:34 -0700)]
add tests for random crop sequence -- currently both fail

Jeff Donahue [Tue, 22 Apr 2014 11:42:47 +0000 (04:42 -0700)]
add data layer crop tests

Jeff Donahue [Thu, 17 Apr 2014 04:00:30 +0000 (21:00 -0700)]
add random_seed field to SolverParameter and have solver use it --
already works for lenet, doesn't work for imagenet w/ rand() calls

Jeff Donahue [Wed, 23 Apr 2014 06:12:14 +0000 (23:12 -0700)]
add forward tests (via reference impl) for SigmoidCrossEntropyLayer

Evan Shelhamer [Tue, 22 Apr 2014 22:49:20 +0000 (15:49 -0700)]
Merge pull request #350 from longjon/finicky-exhaustive

Make CheckGradientExhaustive fail for topless layers

Jonathan L Long [Tue, 22 Apr 2014 06:41:50 +0000 (23:41 -0700)]
make CheckGradientExhaustive fail for topless layers

Without this commit, it is possible to mistakenly call
CheckGradientExhaustive on a layer with no top blobs (i.e., a loss
layer), causing the gradient check to silently succeed while doing
nothing. With this commit, doing so will cause the test to fail.
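
The guard itself only needs to be a single assertion at the top of CheckGradientExhaustive, roughly (message wording assumed):

    // A layer with no top blobs gives the exhaustive loop nothing to iterate
    // over, so it would "pass" vacuously; fail loudly instead.
    CHECK_GT(top->size(), 0) << "Exhaustive mode requires at least one top blob.";
    // ...then proceed to check every element of every top blob as before.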

Evan Shelhamer [Tue, 22 Apr 2014 05:20:36 +0000 (22:20 -0700)]
Merge pull request #348 from jeffdonahue/datalayer-singleton-bugfix

Fix singleton call in data layers

Note previous crash reports were not in fact due to RNG, although they led to improvements of the RNG code.

Jeff Donahue [Tue, 22 Apr 2014 03:43:39 +0000 (20:43 -0700)]
do the same as prev commit for ImageDataLayer

Jeff Donahue [Tue, 22 Apr 2014 03:35:43 +0000 (20:35 -0700)]
fix bug where DataLayerPrefetch creates its own Caffe singleton, causing
the phase to always be set to TRAIN (always random crops) and RNG failures

Evan Shelhamer [Mon, 21 Apr 2014 03:12:57 +0000 (20:12 -0700)]
Comment current forward/backward responsibilities

Evan Shelhamer [Mon, 21 Apr 2014 03:07:06 +0000 (20:07 -0700)]
Proofread install docs

Evan Shelhamer [Sun, 20 Apr 2014 23:57:56 +0000 (16:57 -0700)]
Merge pull request #339 from sergeyk/dev

Update installation docs

Sergey Karayev [Sun, 20 Apr 2014 20:38:30 +0000 (13:38 -0700)]
installation doc update

Evan Shelhamer [Sat, 19 Apr 2014 06:18:59 +0000 (23:18 -0700)]
Merge pull request #336 from jeffdonahue/fix-rng-segfault

Fix RNG segfault related to #297

Evan Shelhamer [Fri, 18 Apr 2014 18:51:41 +0000 (11:51 -0700)]
fix examples path in mnist leveldb sh

Jeff Donahue [Fri, 18 Apr 2014 18:11:01 +0000 (11:11 -0700)]
Merge pull request #332 from jeffdonahue/share-trained-layers

add ShareTrainedLayersWith method and use for test net in solver

Jeff Donahue [Fri, 18 Apr 2014 17:51:30 +0000 (10:51 -0700)]
remove now unused set_generator and related code

Jeff Donahue [Fri, 18 Apr 2014 17:36:28 +0000 (10:36 -0700)]
pass caffe rng ref into variate_generator constructor instead of having
caffe rng adopt its state

Jeff Donahue [Fri, 18 Apr 2014 01:23:33 +0000 (18:23 -0700)]
remove unnecessary return from void set_generator

Jeff Donahue [Wed, 16 Apr 2014 17:01:30 +0000 (10:01 -0700)]
add ShareTrainedLayersWith method and use for test net in solver

Evan Shelhamer [Wed, 16 Apr 2014 15:46:49 +0000 (09:46 -0600)]
Merge pull request #330 from jeffdonahue/mnist-autoencoder-example

MNIST autoencoder example

Jeff Donahue [Wed, 16 Apr 2014 15:14:27 +0000 (08:14 -0700)]
change to correct next layer id for merge

Jeff Donahue [Wed, 16 Apr 2014 15:13:20 +0000 (08:13 -0700)]
clear sigmoid top vec at initialization

Jeff Donahue [Wed, 16 Apr 2014 05:46:45 +0000 (22:46 -0700)]
add sigmoid cross ent layer unit tests

Jeff Donahue [Wed, 16 Apr 2014 05:26:11 +0000 (22:26 -0700)]
mnist_autoencoder_solver cleanup

Jeff Donahue [Wed, 16 Apr 2014 05:25:46 +0000 (22:25 -0700)]
change lenet dir to 'mnist' in docs