Linguistics Connections

Admittedly, I found the chapter difficult to understand, having no background in neural networks. But after talking to Priscy about it, I feel like I have a better grasp of it. I can better appreciate the methods for helping these networks “learn”, especially through back-propagation. What was most interesting to me, however, was the comparison between these types of models and linguistics. As a linguistics major, I have still taken surprisingly few linguistics classes, so I appreciated this way of assessing data. I really enjoyed looking at the tensor product representation of the sentence “Mary gave the book to John”. It struck me as similar to the structure of syntactic trees, which share the same idea of generalizing a role (such as noun phrase or prepositional phrase, down to something as fine-grained as ‘tense’) paired with a filler that adapts each sentence to fit the tree. Thinking of it in this way helped me better comprehend the computational version.
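As I understand it, that role/filler idea can be sketched numerically: each role and each filler gets a vector, a binding is the outer product of the two, and the whole sentence is the sum of its bindings. The role names and vector sizes below are my own illustrative choices, not from the chapter; this is a minimal sketch assuming the role vectors are orthonormal, which is what makes a filler recoverable from its role.

```python
import numpy as np

# Toy tensor product representation of "Mary gave the book to John".
# Role labels (agent, verb, theme, recipient) are illustrative choices.
rng = np.random.default_rng(0)

# Orthonormal role vectors (rows of the identity matrix).
roles = {"agent": np.eye(4)[0], "verb": np.eye(4)[1],
         "theme": np.eye(4)[2], "recipient": np.eye(4)[3]}
# Arbitrary filler vectors for the words themselves.
fillers = {w: rng.normal(size=8) for w in ["Mary", "gave", "book", "John"]}

# Bind each filler to its role with an outer product, then superpose
# all the bindings into one matrix representing the sentence.
sentence = sum(np.outer(fillers[f], roles[r])
               for f, r in [("Mary", "agent"), ("gave", "verb"),
                            ("book", "theme"), ("John", "recipient")])

# Unbinding: since the roles are orthonormal, multiplying the sentence
# matrix by a role vector recovers exactly that role's filler.
recovered = sentence @ roles["agent"]
print(np.allclose(recovered, fillers["Mary"]))  # True
```

The appeal, to me, is that the same role vectors can be reused across sentences, just as the same tree positions recur across sentences in syntax.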

I hope that, in class, we can further discuss the methods these networks use for learning. The ideas discussed in this paper interested me, but they were phrased in ways that made them difficult for me to read and really comprehend.
