Sergey Karayev [Fri, 23 May 2014 01:18:25 +0000 (18:18 -0700)]
link to demo
Evan Shelhamer [Thu, 22 May 2014 08:04:36 +0000 (01:04 -0700)]
Merge pull request #435 from shelhamer/v1-models
Release v1 model defs + weights
Evan Shelhamer [Thu, 22 May 2014 08:02:56 +0000 (01:02 -0700)]
fix draw_net python script
include caffe.draw for drawing functions.
Evan Shelhamer [Thu, 22 May 2014 07:56:35 +0000 (00:56 -0700)]
release v1 model defs + weights
- Caffe reference ImageNet model
- AlexNet
Note that one can upgrade the weights locally with
`upgrade_net_proto_binary.bin` to avoid re-downloading.
Evan Shelhamer [Thu, 22 May 2014 06:43:49 +0000 (23:43 -0700)]
point out @niuzhiheng's work on the Windows port
Evan Shelhamer [Wed, 21 May 2014 17:52:45 +0000 (10:52 -0700)]
fix test_all path in docs
Evan Shelhamer [Tue, 20 May 2014 21:44:47 +0000 (14:44 -0700)]
link canonical bvlc site
Evan Shelhamer [Tue, 20 May 2014 21:42:37 +0000 (14:42 -0700)]
fix detection notebook link
Evan Shelhamer [Tue, 20 May 2014 21:20:15 +0000 (14:20 -0700)]
Merge pull request #429 from shelhamer/next
Next: 0.999
Evan Shelhamer [Tue, 20 May 2014 19:44:51 +0000 (12:44 -0700)]
Back-merge changes in master
* master:
bundle presentation in gh-pages for now...
fix typo pointed out by @yinxusen
note support for non-MKL installation in dev
include pretrained snapshot and performance details
Document AlexNet model, include download script
define AlexNet architecture
polished .gitignore
Evan Shelhamer [Tue, 20 May 2014 19:20:00 +0000 (12:20 -0700)]
Merge pull request #311 from shelhamer/python-fixes
Improve python wrapper
Evan Shelhamer [Tue, 20 May 2014 08:04:17 +0000 (01:04 -0700)]
update notebook examples with new wrapper usage, re-organize
Evan Shelhamer [Tue, 20 May 2014 18:41:42 +0000 (11:41 -0700)]
preprocess single inputs instead of lists
For compositionality and to match caller expectations.
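As an illustration of what per-input formatting involves, here is a minimal
numpy sketch (not the wrapper source); the input size, channel order, raw
scale, and mean below are assumptions.

    import numpy as np
    from skimage.transform import resize

    def format_one(im, mean, in_h=227, in_w=227):
        """Turn one H x W x K image in [0, 1] into a K x H x W caffe input."""
        caffe_in = resize(im, (in_h, in_w)).astype(np.float32)  # fit the input dims
        caffe_in = caffe_in[:, :, ::-1]                         # RGB -> BGR (assumed)
        caffe_in = caffe_in.transpose((2, 0, 1)) * 255.0        # K x H x W, raw scale
        return caffe_in - mean                                  # subtract K x H x W mean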
Evan Shelhamer [Tue, 20 May 2014 06:50:15 +0000 (23:50 -0700)]
windowed detection in python
Evan Shelhamer [Tue, 20 May 2014 05:55:50 +0000 (22:55 -0700)]
squash infuriating loop assignment bug in batching
Evan Shelhamer [Mon, 19 May 2014 22:31:49 +0000 (15:31 -0700)]
image classification in python
Evan Shelhamer [Mon, 19 May 2014 01:25:18 +0000 (18:25 -0700)]
fix padding for the last batch
Evan Shelhamer [Mon, 19 May 2014 00:14:53 +0000 (17:14 -0700)]
split drawnet into module code and script
Don't run scripts in the module dir to avoid import collisions between
io and caffe.io.
Evan Shelhamer [Mon, 19 May 2014 00:13:05 +0000 (17:13 -0700)]
add caffe.io submodule for conversions, image loading and resizing
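A hedged usage sketch of the new submodule, assuming helpers named
load_image and resize_image and a placeholder file name:

    import caffe.io

    im = caffe.io.load_image('input.jpg')           # H x W x K float array in [0, 1]
    im_256 = caffe.io.resize_image(im, (256, 256))  # resize, keeping the K channels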
Evan Shelhamer [Mon, 19 May 2014 00:11:38 +0000 (17:11 -0700)]
fix python mean subtraction
Sergey Karayev [Mon, 19 May 2014 22:50:33 +0000 (15:50 -0700)]
Merge pull request #376 from sergeyk/layer_reorg
Layer definitions and declarations re-organization and documentation
Sergey Karayev [Mon, 19 May 2014 18:11:37 +0000 (11:11 -0700)]
Incorporated Evan’s comments for neuron layers
Sergey Karayev [Mon, 19 May 2014 17:44:21 +0000 (10:44 -0700)]
Cosmetic change in ConcatLayer
Sergey Karayev [Mon, 19 May 2014 17:43:21 +0000 (10:43 -0700)]
Lil’ more docstring, and cosmetic change in EuclideanLossLayer
Sergey Karayev [Tue, 29 Apr 2014 07:21:15 +0000 (00:21 -0700)]
fwd/back math docs for neuron layers
Evan Shelhamer [Fri, 16 May 2014 23:03:55 +0000 (16:03 -0700)]
drop cute names in favor of Net.{pre,de}process() for input formatting
...and refer to inputs as inputs and not images since general vectors
and matrices are perfectly fine.
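A hedged round-trip sketch with the renamed methods; it assumes an
initialized pycaffe Net `net` with an input blob named 'data', and the file
name is a placeholder.

    import caffe

    im = caffe.io.load_image('input.jpg')      # any H x W x K array in [0, 1]
    caffe_in = net.preprocess('data', im)      # format one input for the net
    im_back = net.deprocess('data', caffe_in)  # undo the formatting for viewing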
Evan Shelhamer [Fri, 16 May 2014 01:52:07 +0000 (18:52 -0700)]
Net.caffeinate() and Net.decaffeinate() format/unformat lists
Evan Shelhamer [Fri, 16 May 2014 01:14:45 +0000 (18:14 -0700)]
take blob args as ndarrays and assign on the python side
Take blob args and give blob returns as single ndarrays instead of lists
of arrays.
Assign the net blobs and diffs as needed on the python side, which
reduces copies and simplifies the C++ side of the wrapper.
Thanks @longjon for the suggestion.
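A sketch of the python-side assignment described here, assuming an
initialized net with a 'data' input blob and a (num, channels, height, width)
ndarray `batch`:

    net.blobs['data'].data[...] = batch   # assign into the input blob in place
    net.forward()                         # run the prefilled forward pass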
Sergey Karayev [Tue, 29 Apr 2014 02:40:43 +0000 (19:40 -0700)]
Cosmetic change in prep for data layer work
Sergey Karayev [Tue, 29 Apr 2014 02:39:36 +0000 (19:39 -0700)]
Split all loss layers into own .cpp files
Sergey Karayev [Tue, 29 Apr 2014 02:06:07 +0000 (19:06 -0700)]
layer definition reorganization and documentation
- split out neuron, loss, and data layers into own header files
- added LossLayer class with common SetUp checks
- in-progress concise documentation of each layer's purpose
Evan Shelhamer [Thu, 15 May 2014 20:52:07 +0000 (13:52 -0700)]
resize to input dimensions when formatting in python
Evan Shelhamer [Thu, 15 May 2014 20:06:17 +0000 (13:06 -0700)]
replace iterator with indices for consistency
Evan Shelhamer [Thu, 15 May 2014 19:35:32 +0000 (12:35 -0700)]
python style
Evan Shelhamer [Thu, 15 May 2014 17:20:43 +0000 (10:20 -0700)]
fix accidental revert of Init() from f5c28581
Evan Shelhamer [Thu, 15 May 2014 06:45:33 +0000 (23:45 -0700)]
batch inputs in python by forward_all() and forward_backward_all()
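A hedged usage sketch of the batching helpers named here; the blob name
'data' is an assumption and `inputs` stands for a list of preprocessed input
arrays.

    all_outs = net.forward_all(data=inputs)                      # batched forward
    all_outs, all_diffs = net.forward_backward_all(data=inputs)  # forward + backward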
Evan Shelhamer [Thu, 15 May 2014 03:17:09 +0000 (20:17 -0700)]
don't squeeze blob arrays for python
Preserve the non-batch dimensions of blob arrays, even for singletons.
The forward() and backward() helpers take lists of ndarrays instead of a
single ndarray per blob, and lists of ndarrays are likewise returned.
Note that for output the blob array could actually be returned as a
single ndarray instead of a list.
Evan Shelhamer [Thu, 15 May 2014 00:38:33 +0000 (17:38 -0700)]
python forward() and backward() extract any blobs and diffs
Evan Shelhamer [Wed, 14 May 2014 21:37:33 +0000 (14:37 -0700)]
python Net.backward() helper and Net.BackwardPrefilled()
Evan Shelhamer [Wed, 14 May 2014 21:02:54 +0000 (14:02 -0700)]
bad forward/backward inputs throw exceptions instead of crashing python
Evan Shelhamer [Wed, 14 May 2014 20:39:06 +0000 (13:39 -0700)]
pycaffe Net.forward() helper
Do the forward pass either on prefilled input blobs or by packaging the given
input + output blobs, and return a {output blob name: output list} dict.
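A hedged sketch of the two calling modes; `caffe_in` stands for one
preprocessed input and the blob names are assumptions.

    out = net.forward()                 # prefilled: use data already in the input blobs
    out = net.forward(data=[caffe_in])  # packaged: inputs keyed by input blob name
    prob = out['prob'][0]               # {output blob name: output list} dict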
Evan Shelhamer [Wed, 14 May 2014 02:56:14 +0000 (19:56 -0700)]
set input preprocessing per blob in python
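A hedged configuration sketch, assuming an initialized net keyed by the input
blob name 'data'; the mean file and values are assumptions.

    import numpy as np

    net.set_mean('data', np.load('ilsvrc_2012_mean.npy'))  # K x H x W mean
    net.set_raw_scale('data', 255)                         # images loaded in [0, 1]
    net.set_channel_swap('data', (2, 1, 0))                # RGB -> BGR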
Evan Shelhamer [Wed, 14 May 2014 01:53:36 +0000 (18:53 -0700)]
expose input and output blob names to python as lists
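A small hedged sketch of the exposed lists, assuming an initialized
ImageNet-style net:

    print(net.inputs)    # e.g. ['data']
    print(net.outputs)   # e.g. ['prob']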
Jeff Donahue [Wed, 14 May 2014 20:05:17 +0000 (13:05 -0700)]
Merge pull request #417 from shelhamer/create-and-write-proto
Write/create/truncate prototxt when saving to fix #341
Evan Shelhamer [Wed, 14 May 2014 19:35:33 +0000 (12:35 -0700)]
fix workaround in net prototxt upgrade
Evan Shelhamer [Wed, 14 May 2014 19:28:55 +0000 (12:28 -0700)]
Write/create/truncate prototxt when saving to fix #341
WriteProtoToTextFile now saves the prototxt whether or not the file
already exists and sets the permissions to owner read/write and group +
other read.
Thanks @beam2d and @chyh1990 for pointing out the open modes bug.
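For reference, a Python analogue of the corrected open behavior (not the C++
fix itself): create the file if missing, truncate it if present, and set owner
read/write plus group/other read (0644). `text_proto` is a placeholder string.

    import os

    fd = os.open('net.prototxt', os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    with os.fdopen(fd, 'w') as f:
        f.write(text_proto)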
Evan Shelhamer [Thu, 10 Apr 2014 03:27:58 +0000 (20:27 -0700)]
pycaffe comments, lint
Evan Shelhamer [Thu, 10 Apr 2014 01:37:32 +0000 (18:37 -0700)]
add python io getters, mean helper, and image caffeinator/decaffeinator
Evan Shelhamer [Thu, 10 Apr 2014 01:33:18 +0000 (18:33 -0700)]
make python wrapper mean match binaryproto dimensions
ilsvrc_2012_mean.npy has dims K x H x W.
Code written for the old D x D x K mean needs to be rewritten!
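A hedged numpy sketch of adapting old code to the new layout:

    import numpy as np

    mean = np.load('ilsvrc_2012_mean.npy')  # new layout: (K, H, W)
    mean_hwk = mean.transpose((1, 2, 0))    # old (H, W, K) layout, if legacy code needs it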
Evan Shelhamer [Wed, 9 Apr 2014 21:02:40 +0000 (14:02 -0700)]
match existing python formatting
Evan Shelhamer [Wed, 14 May 2014 01:10:22 +0000 (18:10 -0700)]
Merge pull request #414 from shelhamer/net-output-blobs
Make net know output blob indices
Evan Shelhamer [Wed, 14 May 2014 01:08:06 +0000 (18:08 -0700)]
net knows output blobs
Evan Shelhamer [Tue, 13 May 2014 20:16:19 +0000 (13:16 -0700)]
Merge pull request #413 from shelhamer/cublas-status-not-supported
add cublas status in cuda 6 to fix warning
Evan Shelhamer [Tue, 13 May 2014 19:30:34 +0000 (12:30 -0700)]
add cublas status in cuda 6 to fix warning
...and #define it for older CUDA versions so the build does not break.
Jeff Donahue [Sun, 11 May 2014 00:41:39 +0000 (17:41 -0700)]
Merge pull request #406 from jeffdonahue/makefile-include-bug
Fix Makefile header dependency bug
Jeff Donahue [Sun, 11 May 2014 00:20:19 +0000 (17:20 -0700)]
fix Makefile bug - HXX_SRCS matched files that don't end in .hpp instead
of files that do...
Evan Shelhamer [Fri, 9 May 2014 21:56:13 +0000 (14:56 -0700)]
Merge pull request #403 from jeffdonahue/solver-mode-enum
Make solver_mode an enum: CPU or GPU
Jeff Donahue [Fri, 9 May 2014 21:50:05 +0000 (14:50 -0700)]
make solver_mode an enum with CPU and GPU -- fully backwards compatible
with old 0/1 style
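A hedged sketch of setting the new enum through the generated Python
protobuf bindings; the caffe_pb2 module path is an assumption about the build.

    from caffe.proto import caffe_pb2

    solver_param = caffe_pb2.SolverParameter()
    solver_param.solver_mode = caffe_pb2.SolverParameter.GPU  # previously the bare integer 1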
Jeff Donahue [Wed, 7 May 2014 04:34:50 +0000 (21:34 -0700)]
Merge pull request #396 from longjon/math-includes
Improve includes in util/math_functions.hpp
Jonathan L Long [Tue, 6 May 2014 23:40:09 +0000 (16:40 -0700)]
improve includes in util/math_functions.hpp
This commit removes the redundant <math.h>, and adds the necessary
<stdint.h> and <glog/logging.h>.
Jeff Donahue [Tue, 6 May 2014 18:59:25 +0000 (11:59 -0700)]
note bug in cifar10 full with CPU computation
Evan Shelhamer [Tue, 6 May 2014 17:04:13 +0000 (10:04 -0700)]
bundle presentation in gh-pages for now...
...Dropbox disabled our link. The actual solution: configure a web server for
models and other materials soon.
Evan Shelhamer [Fri, 2 May 2014 21:11:31 +0000 (14:11 -0700)]
Merge pull request #294 from longjon/memory-data-layer
Add a layer for in-memory data, and expose it to Python
Jonathan L Long [Fri, 25 Apr 2014 21:56:44 +0000 (14:56 -0700)]
fix lint error in syncedmem.hpp
Jonathan L Long [Thu, 17 Apr 2014 10:06:20 +0000 (03:06 -0700)]
pycaffe: allow 1d labels to be passed to set_input_arrays
Jonathan L Long [Fri, 25 Apr 2014 21:33:25 +0000 (14:33 -0700)]
pycaffe: add Net.set_input_arrays for input from numpy
This requires a net whose first layer is a MemoryDataLayer.
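A hedged usage sketch, assuming an initialized net with a leading
MemoryDataLayer; the shapes and float32 dtype below are assumptions to check
against the layer configuration.

    import numpy as np

    data = np.random.rand(10, 3, 227, 227).astype(np.float32)  # N x K x H x W
    labels = np.arange(10, dtype=np.float32)                    # 1d labels are accepted
    net.set_input_arrays(data, labels)
    net.forward()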
Jonathan L Long [Fri, 25 Apr 2014 21:29:50 +0000 (14:29 -0700)]
pycaffe: store a shared_ptr<CaffeNet> in SGDSolver
Doing this, rather than constructing the CaffeNet wrapper every time,
will allow the wrapper to hold references that last at least as long as
SGDSolver (which will be necessary to ensure that data used by
MemoryDataLayer doesn't get freed).
Jonathan L Long [Fri, 25 Apr 2014 21:09:02 +0000 (14:09 -0700)]
pycaffe: let boost pass shared_ptr<CaffeNet>
Jonathan L Long [Thu, 17 Apr 2014 09:24:02 +0000 (02:24 -0700)]
add basic tests for MemoryDataLayer
Jonathan L Long [Fri, 4 Apr 2014 22:20:02 +0000 (15:20 -0700)]
add size accessors to MemoryDataLayer
This will facilitate input size checking for pycaffe (and potentially
others).
Jonathan L Long [Fri, 4 Apr 2014 02:27:21 +0000 (19:27 -0700)]
add MemoryDataLayer for reading input from contiguous blocks of memory
Jonathan L Long [Fri, 4 Apr 2014 02:19:47 +0000 (19:19 -0700)]
add set_cpu_data to Blob and SyncedMemory
This allows a blob to be updated without a copy by using already existing
memory (and will support MemoryDataLayer).
Jeff Donahue [Wed, 30 Apr 2014 19:53:32 +0000 (12:53 -0700)]
Merge pull request #378 from longjon/no-merge-duplicates
Note the last added layer/params in caffe.proto to prevent conflicts
Jonathan L Long [Wed, 30 Apr 2014 19:28:55 +0000 (12:28 -0700)]
note the last added layer/params in caffe.proto to prevent conflicts
The current scheme does not actually prevent conflicts, since three-way
merge will accept simultaneous changes that agree on the next number.
This commit fixes this by explicitly noting the last layer added.
Evan Shelhamer [Wed, 30 Apr 2014 18:37:56 +0000 (11:37 -0700)]
Merge pull request #377 from sguada/fix_initial_test
Keep same format for all net testing log messages
Sergio Guadarrama [Wed, 30 Apr 2014 18:26:55 +0000 (11:26 -0700)]
Keep uniform test messages for all the tests
Jeff Donahue [Tue, 29 Apr 2014 04:50:00 +0000 (21:50 -0700)]
rollback 8368818, which does not build
Sergey Karayev [Tue, 29 Apr 2014 02:47:17 +0000 (19:47 -0700)]
Add Sublime Text project settings to gitignore
Sergey Karayev [Tue, 29 Apr 2014 02:44:32 +0000 (19:44 -0700)]
Handling CUBLAS_STATUS_NOT_SUPPORTED to suppress warning
Jeff Donahue [Sat, 26 Apr 2014 23:46:39 +0000 (16:46 -0700)]
Merge pull request #370 from shelhamer/test-net-polish
Polish test_net
Evan Shelhamer [Sat, 26 Apr 2014 23:33:40 +0000 (16:33 -0700)]
fix test_net to upgrade params from v0 if needed
Evan Shelhamer [Sat, 26 Apr 2014 23:08:22 +0000 (16:08 -0700)]
default test net device to 0 and log device chosen
Jeff Donahue [Sat, 26 Apr 2014 22:47:35 +0000 (15:47 -0700)]
set seed in neuron and power layer tests for deterministic results
Jeff Donahue [Sat, 26 Apr 2014 22:45:11 +0000 (15:45 -0700)]
Merge pull request #367 from shelhamer/randomize-test-order
Randomize order of test execution by `make runtest`
Evan Shelhamer [Sat, 26 Apr 2014 22:23:56 +0000 (15:23 -0700)]
proofreading and trivial polish
Evan Shelhamer [Sat, 26 Apr 2014 22:22:42 +0000 (15:22 -0700)]
add device id arg to test_net (fix #232)
Jeff Donahue [Sat, 26 Apr 2014 22:13:15 +0000 (15:13 -0700)]
Merge pull request #303 from longjon/hinge-loss-layer
HingeLossLayer
Jeff Donahue [Sat, 26 Apr 2014 22:09:17 +0000 (15:09 -0700)]
Merge pull request #368 from shelhamer/fix-data-layer-sequence-tests
Fix data layer sequence tests to avoid leveldb contention and lock failures
Evan Shelhamer [Sat, 26 Apr 2014 21:57:11 +0000 (14:57 -0700)]
scope data layer sequence tests to avoid leveldb contention
Evan Shelhamer [Sat, 26 Apr 2014 21:42:38 +0000 (14:42 -0700)]
set phase in dropout tests
Evan Shelhamer [Sat, 26 Apr 2014 01:54:12 +0000 (18:54 -0700)]
randomize order of test execution by make runtest
Jeff Donahue [Sat, 26 Apr 2014 01:42:37 +0000 (18:42 -0700)]
Merge pull request #366 from shelhamer/test-phase-fix
set TRAIN in CommonTest.TestPhase
Evan Shelhamer [Sat, 26 Apr 2014 01:33:40 +0000 (18:33 -0700)]
set TRAIN in CommonTest.TestPhase
...to avoid effects of test execution order. The original intention was
to check that TRAIN is the default, but it is better to have consistent
results.
Jonathan L Long [Fri, 25 Apr 2014 23:44:05 +0000 (16:44 -0700)]
test HingeLossLayer
Based on SoftmaxWithLossLayerTest.
Jonathan L Long [Fri, 25 Apr 2014 23:38:48 +0000 (16:38 -0700)]
make gradient checker's kink use feature absolute value
In theory, layer functions could be nonsmooth anywhere; in all cases in
use so far, they are nonsmooth at either zero or +1 and -1. In the
future, it might be necessary to generalize the kink mechanism beyond
this stopgap measure.
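A hedged sketch of the resulting check: the finite-difference comparison is
only made where the function is assumed smooth, measured on the feature's
absolute value after this change.

    def check_gradient_at(feature, kink, kink_range):
        # compare analytic vs. numeric gradients only away from the kink
        return abs(abs(feature) - kink) > kink_range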
Jonathan L Long [Tue, 8 Apr 2014 03:35:33 +0000 (20:35 -0700)]
add HingeLossLayer for one-vs-all hinge loss
This layer implements a "one-vs-all" hinge loss, (1/n) sum_ij max(0, 1 -
y_ij x_ij), with bottom blob x_ij (i ranging over examples and j over
classes), and y_ij = +1/-1 indicating the label. No regularization is
included, since regularization is done via weight decay or using the
parameters of another layer. The gradient is taken to be zero at the
hinge point. This commit only provides the CPU implementation.
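A hedged numpy sketch of the forward computation defined above; `bottom` is
an N x C score array and `label` holds integer class indices.

    import numpy as np

    def hinge_loss(bottom, label):
        n, c = bottom.shape
        y = -np.ones((n, c))
        y[np.arange(n), label] = 1.0  # y_ij = +1 for the true class, -1 otherwise
        return np.sum(np.maximum(0.0, 1.0 - y * bottom)) / n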
Evan Shelhamer [Fri, 25 Apr 2014 23:02:37 +0000 (16:02 -0700)]
Merge pull request #365 from longjon/pycaffe-empty-nets
Add unary CaffeNet constructor for uninitialized nets
Jonathan L Long [Fri, 25 Apr 2014 21:50:36 +0000 (14:50 -0700)]
pycaffe: add unary CaffeNet constructor for uninitialized nets
Note that parameters are uninitialized, not zero-filled.
Evan Shelhamer [Fri, 25 Apr 2014 19:51:22 +0000 (12:51 -0700)]
Merge pull request #363 from jeffdonahue/speedup-gradient-check
Gradient check speedups
Jeff Donahue [Fri, 25 Apr 2014 03:24:19 +0000 (20:24 -0700)]
eltwise gradient checker