brain of mat kelcey

( with a fun exploration of easy/hard positive/negative mining )

say we want to train a neural net whose output is an embedding of its input. how would we train it? to train anything we need a loss function, and usually that function describes exactly what we want the output to be in the form of explicit labels. but for embeddings things are a little different: we might not care exactly what embedding values we get, we only care about how they relate across different inputs. one way of specifying this relationship is with a triplet loss.
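as a minimal sketch of that relational loss, here's a numpy version of a triplet loss plus a hardest-negative miner of the kind the easy/hard mining discussion is about ( function names, the 0.2 margin, and the shapes are illustrative assumptions, not lifted from the post ):

import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    # squared euclidean distances from the anchor to its positive and negative
    d_pos = np.sum((anchor - positive) ** 2, axis=-1)
    d_neg = np.sum((anchor - negative) ** 2, axis=-1)
    # hinge: loss is zero once the negative is at least `margin` further
    # from the anchor than the positive; otherwise the gap drives the gradient
    return np.maximum(0.0, d_pos - d_neg + margin)

def hardest_negative(anchor, candidates):
    # "hard" negative mining: of all candidate negatives, the one closest
    # to the anchor violates the margin the most
    d = np.sum((candidates - anchor) ** 2, axis=-1)
    return candidates[np.argmin(d)]

# usage: a positive near the anchor, negatives drawn at random
anchor = np.random.randn(32)
positive = anchor + 0.1 * np.random.randn(32)
negatives = np.random.randn(10, 32)
loss = triplet_loss(anchor, positive, hardest_negative(anchor, negatives))

note the trade-off this exposes: an "easy" negative is already far from the anchor and contributes zero loss, so mining harder negatives keeps the gradient informative, which is exactly the easy/hard distinction the post explores.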

1 mention: @mat_kelcey
Date: 2019/06/13 17:16

Referring Tweets

@mat_kelcey "pybullet grasping & time contrastive network embeddings" blog: https://t.co/HtTkbRM8rF code: https://t.co/JNAux3XYwt
