The Wayback Machine - http://web.archive.org/web/20200130170652/https://github.com/topics/graph-learning

graph-learning

Here are 13 public repositories matching this topic...

dylanwinty
dylanwinty commented Aug 17, 2019

In base_layers.py, the `call` method of the embedding layer:

```python
def call(self, inputs):
    shape = inputs.shape
    inputs = tf.reshape(inputs, [-1])           # flatten indices to 1-D
    output_shape = shape.concatenate(self.dim)  # original shape + embedding dim
    # TensorShape -> plain Python list; unknown dims (None) become -1 for reshape
    output_shape = [d if d is not None else -1 for d in output_shape.as_list()]
    return tf.reshape(tf.nn.embedding_lookup(self.embeddings, inputs), output_shape)
```
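The pattern in that snippet is: flatten an arbitrary-rank index tensor, look up one embedding row per index, then restore the original shape with the embedding dimension appended. A minimal NumPy sketch of the same reshape-then-lookup logic (the function and variable names here are illustrative, not from the repository):

```python
import numpy as np

def embedding_call(embeddings, inputs, dim):
    """Look up rows of `embeddings` for an arbitrary-rank integer index array."""
    shape = inputs.shape                 # e.g. (batch, seq)
    flat = inputs.reshape(-1)            # flatten to 1-D indices
    output_shape = list(shape) + [dim]   # original index shape + embedding dim
    return embeddings[flat].reshape(output_shape)

# Example: 4 embeddings of dimension 3, looked up with a (2, 2) index tensor
emb = np.arange(12, dtype=float).reshape(4, 3)
idx = np.array([[0, 3], [1, 2]])
out = embedding_call(emb, idx, dim=3)
# out has shape (2, 2, 3); out[0, 1] is row 3 of emb
```

The `-1` substitution in the TF version serves the same purpose as keeping the index shape here: `tf.reshape` needs concrete or wildcard dimensions, and any batch dimension unknown at graph-build time must become `-1`.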

GraphSage-ShallowEncoder e

linas
linas commented Jul 10, 2019

There is some kind of thread-race condition described at lines 550-575 of link-pipeline.scm -- it's talking about problems that arise when the same sentence is fed through the system in rapid succession. Apparently, the code tries to remove Atoms to reduce clutter, but is sloppy in doing so. There is no reason to be sloppy; this should be fixed. The best way of doing this might be to use `(
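The race described above -- cleanup code deleting Atoms that a concurrent run of the same sentence still needs -- is a shared-resource lifetime problem. A minimal Python sketch (hypothetical names, not the Scheme code in link-pipeline.scm) of guarding cleanup with a reference count so an entry is only removed when its last holder releases it:

```python
import threading

class AtomCache:
    """Toy cache: entries are deleted only when no pipeline run still holds them."""
    def __init__(self):
        self._lock = threading.Lock()
        self._refs = {}   # key -> reference count
        self._data = {}   # key -> value

    def acquire(self, key, value):
        with self._lock:
            self._refs[key] = self._refs.get(key, 0) + 1
            self._data.setdefault(key, value)
            return self._data[key]

    def release(self, key):
        # Deleting unconditionally here, ignoring other holders, is the
        # "sloppy" cleanup that causes the race; the count prevents it.
        with self._lock:
            self._refs[key] -= 1
            if self._refs[key] == 0:
                del self._refs[key]
                del self._data[key]

cache = AtomCache()
cache.acquire("sentence", "parse-1")   # first run of the sentence
cache.acquire("sentence", "parse-1")   # same sentence fed again in rapid succession
cache.release("sentence")              # first run finishes cleanup
alive_after_first_release = "sentence" in cache._data   # still held by second run
cache.release("sentence")              # last holder gone
alive_after_second_release = "sentence" in cache._data  # now actually removed
```

This is only an illustration of the hazard; the actual fix in the AtomSpace would use whatever locking and Atom-removal primitives the Scheme API provides.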

