Viewing 1 past event matching “meetup:event=ldwlhqyxpblb” by Date.
Thursday, Nov 8, 2018
Portland Machine Learning Meetup - PDX ML – Uncorked Studios
If you need parking, there's a parking deck below Safeway.

Agenda:
- 6:00 p.m.: Food, beverage, and networking
- 6:40 p.m.: Welcome message by Karl Fezer
- 6:45 p.m.: Speaker 1: "Transfer Learning, or How to Stand on the Shoulders of Giants" - James DiPadua
- 8:30 p.m.: End

Speaker 1 Details:

Abstract: There's been a mountain of research into deep neural networks' practical applications in image, audio, and text processing. These deep networks, however, are often built on large corpora (such as ImageNet or Wikipedia) that may not apply directly to your domain, and gathering data specific to your problem space can be both lengthy and expensive, which makes the business win a hard sell. Transfer Learning can quickly eliminate many of those problems. In "How to Stand on the Shoulders of Giants," we'll discuss the research background of Transfer Learning and how to implement the process in either Keras or PyTorch. The goal is for listeners to feel comfortable with the concept and prepared to begin researching an application in their own workspaces.

Bio: James is a wanderer, tinkerer, and ponderer. Not one to be pinned down, he's more comfortable in the abstract than in the known. He embraces ambiguity with a bear hug. That's a trick, of course: he bear-hugs the ambiguity into little mathematical boxes and then says "Dance!" and, oh, how that ambiguity dances! James currently hangs his hat at Vacasa, where he works as a Senior Data Scientist tackling a myriad of growth objectives with engineering and machine learning.

Speaker 2 Details:

Abstract: Five years ago, Word2Vec offered a leap forward for the average data scientist to run efficient Natural Language Processing algorithms. From a body of text, Word2Vec generates a semantic space in which the trained word vectors are often highly associated with their meaning. The next leap, a semantic space for phrases and sentences, proves tougher both computationally and in faithfully representing a composite meaning over multiple words. Surprisingly, quantifying particle interactions à la quantum mechanics shares a close mathematical similarity with quantifying the meaning of words, phrases, and sentences. In this talk, I will provide an overview of current techniques for modeling language beyond word vectors, and point out the quantum mechanical aspects of these techniques. Emphasis will be placed on the "Compositional Distributional Semantics" model for the task of identifying word ambiguity.
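To make the Keras/PyTorch workflow the first talk alludes to concrete, here is a minimal transfer-learning sketch in PyTorch. It is an illustration, not the speaker's actual code: the backbone (ResNet-18) and the number of target classes are assumptions. The pattern is to reuse an ImageNet-pretrained network, freeze its weights, and retrain only a new classification head on the small domain-specific dataset.

```python
# Minimal transfer-learning sketch in PyTorch (illustrative; not the speaker's code).
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # assumed size of the new, domain-specific label set

# 1. Load a network pretrained on ImageNet ("the shoulders of giants").
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# 2. Freeze the pretrained weights so only the new head is trained.
for param in model.parameters():
    param.requires_grad = False

# 3. Replace the final fully connected layer with one sized for our task.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# 4. Optimize only the new head's parameters on the (small) domain dataset.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One training step on a dummy batch (stand-in for a real DataLoader loop).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

Because gradients flow only into the new head, the network can be adapted with far less data and compute than training from scratch, which is the business win the abstract points to.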
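As context for the second abstract, here is a minimal Word2Vec sketch using gensim; the library choice, toy corpus, and hyperparameters are all assumptions for illustration. It trains a small semantic space and shows the single-vector-per-word limitation that compositional models beyond word vectors aim to address.

```python
# Minimal Word2Vec sketch with gensim (illustrative; library choice assumed).
import numpy as np
from gensim.models import Word2Vec

# Tiny stand-in corpus; real semantic spaces need far more text.
sentences = [
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "king", "rules", "the", "kingdom"],
    ["a", "river", "bank", "floods", "in", "spring"],
    ["deposit", "money", "at", "the", "bank"],
]

model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, seed=42)

# Trained vectors cluster by meaning: nearby words are semantically related.
print(model.wv.most_similar("king", topn=3))

# "bank" gets ONE vector despite two senses (riverbank vs. financial
# institution) -- the ambiguity the talk's compositional models target.
print(model.wv["bank"][:5])

# Naive phrase composition by averaging word vectors: a crude baseline that
# motivates the harder phrase- and sentence-level semantic spaces.
phrase_vec = np.mean([model.wv[w] for w in ["river", "bank"]], axis=0)
```

Averaging discards word order and interaction between words, which is why composing a faithful phrase or sentence meaning is the harder problem the abstract describes.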