Conversation

@samratkokula

[docs] fix typo in interfaces.md

philkr and others added 30 commits August 25, 2015 13:22
[fix] properly backprop through ConcatLayer with propagate_down set
improve net config and shape mismatch error messages
Embed layer for lookup table of one hot encodings
Fix the MVNLayer tests so they actually test what they claim.

MVNLayer fixes: sum_multiplier_ sized correctly; backward gradient calculation.

Gradient calculation per analysis of seanbell, found here:
#1938

Fixes according to review comments.
Draw Deconvolution layers like Convolution layers
Fix SPPLayer top blob num and address `pyramid_height_ == 1`

Also, do nothing in SPPLayer Reshape if already reshaped once and the bottom size is unchanged.
Give the python layer parameter/weight blobs.
Fix EmbedLayer compiler warning for unused variable.
Previously, the prefetch GPU -> top GPU and prefetch CPU -> prefetch GPU
copies were launched concurrently in separate streams, allowing the next
batch to be copied in before the current one is read.

This patch explicitly synchronizes the prefetch -> top copy with respect to the
host, preventing the CPU -> GPU copy from being launched until it completes.
Fix a recently introduced race condition in DataLayer
Compute backward for negative lr_mult
Replaces CAffe_POSTFIX -> Caffe_POSTFIX.
This fixes a memory leak by using delete[] rather than plain delete.
Cleanup: Fixup capitalisation of Caffe_POSTFIX.
Commit 4227828 changed the default snapshot format from HDF5 to BINARYPROTO to
fix #2885. This broke the cifar10 examples, which relied on the old default.

This commit specifies the snapshot_format explicitly since the rest of the
example relies on this being HDF5.
Fix some doxygen warnings about an undocumented argument in Blob and
incorrect documentation for SoftmaxWithLossLayer::Forward_cpu().
cifar10: Fix examples by setting snapshot_format.
eelstork and others added 27 commits November 20, 2015 16:52
GetDB must return a value.
Exclude core.hpp when building without OpenCV
Better normalization options for SoftmaxWithLoss layer
The file `examples/lenet/lenet_stepearly_solver.prototxt` was introduced in #190 by mistake, since stepearly was never actually merged.
Remove bogus stepearly in MNIST example
Skip python layer tests if WITH_PYTHON_LAYER unset
replace snprintf with a C++98 equivalent
Deprecated OpenCV consts leading to compilation error
Safely create temporary files and directories
No more monolithic includes: split layers into their own headers for modular inclusion and build.
Remove dead preprocessor code for number of CUDA threads
[build] Display and store cuDNN version numbers for CMake
don't divide by 0 duration when downloading model binary
Remove hamming_distance and popcount
Fix compatibility issues with extract_features
A Python script for at-a-glance net summary
[docs] fix typo in interfaces.md
@seanbell

Is this an accidental PR?

@samratkokula
Author

Hi Sean,

I am sorry, it was an accidental PR. Last night I was trying to clone the
branch, and a PR was created by accident.

I am new to GitHub and am facing a problem; could you please help me with it?
I have the master code downloaded from #523. However, I need to merge the code
from #3268 and #1380 into my master code. I am using GitHub Desktop to do this
and have tried many different approaches, but somehow the code from #3268 and
#1380 did not merge into my master code.

Can you please help me with this? Any help from you will be greatly
appreciated.

Thanks,
Samrat.

On Fri, Dec 11, 2015 at 8:02 AM, Sean Bell notifications@github.com wrote:

Is this an accidental PR?
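A note for readers who hit the same problem: rather than GitHub Desktop, the usual command-line approach is to fetch each PR as a local branch and merge it into master. The sketch below simulates that workflow entirely locally; the repository layout, file names, and identities are invented for illustration, and only the PR numbers come from this thread. Against GitHub itself, the equivalent fetch of a PR branch is `git fetch origin pull/3268/head:pr-3268`.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in "upstream" repo with a base commit and two branches playing
# the role of the PR branches (#3268 and #1380 from the thread).
git init -q origin-repo
cd origin-repo
git -c user.email=demo@example.com -c user.name=demo commit -q --allow-empty -m "base"
main=$(git symbolic-ref --short HEAD)

git checkout -q -b pr-3268
echo "change from PR 3268" > pr3268.txt
git add pr3268.txt
git -c user.email=demo@example.com -c user.name=demo commit -q -m "PR 3268"

git checkout -q "$main"
git checkout -q -b pr-1380
echo "change from PR 1380" > pr1380.txt
git add pr1380.txt
git -c user.email=demo@example.com -c user.name=demo commit -q -m "PR 1380"
git checkout -q "$main"
cd ..

# Your local "master": clone, then fetch each PR branch and merge it in.
git clone -q origin-repo work
cd work
git fetch -q origin pr-3268:pr-3268 pr-1380:pr-1380
git -c user.email=demo@example.com -c user.name=demo merge -q --no-edit pr-3268
git -c user.email=demo@example.com -c user.name=demo merge -q --no-edit pr-1380
```

After the two merges, the working tree contains the changes from both branches. The first merge fast-forwards; the second creates a merge commit because the branches have diverged, which is exactly what happens when merging two independent PRs into the same master.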

@longjon longjon closed this Dec 11, 2015