Portland Machine Learning Meetup - PDX ML

Uncorked Studios
811 SE Stark St.
Portland, OR 97214, US

Details
Same place as last time, and we're still looking for speakers. If you have anything you would like to present, let me know!

If you need parking, there's a parking deck below Safeway.

Agenda:

6:00 p.m.: Food, beverage, and networking

6:40 p.m.: Welcome message by Karl Fezer

6:45 p.m.: Speaker 1: "Transfer Learning, or How to Stand on the Shoulders of Giants" - James DiPadua

7:30 p.m.: Speaker 2: "Quantum Mechanics for Modeling Composite Semantic Spaces" - Connor Favreau

8:15 p.m.: Project ideas: pitch your project ideas to this meetup group

8:30 p.m.: End

Speaker 1 Details:

Abstract:

There's been a mountain of research into deep neural networks' practical applications in image, audio, and text processing. But these deep networks are often trained on large corpora of data (such as ImageNet or Wikipedia).

Those pretrained networks may not apply directly to your domain, though. Gathering data specific to your problem space can be not only a lengthy process but an expensive one, too. That makes the business win a hard sell.

Transfer Learning can quickly eliminate many of those problems.

In "How to Stand on the Shoulders of Giants," we'll discuss the research behind Transfer Learning and how to implement the process in either Keras or PyTorch. The goal is for listeners to feel comfortable with the concept and prepared to begin researching an application in their own workspaces.
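
For a taste of the technique before the talk, here is a minimal transfer-learning sketch in PyTorch (one of the two libraries the talk covers). The backbone choice, class count, and learning rate are illustrative assumptions, not the speaker's material:

# Minimal transfer-learning sketch: reuse an ImageNet-pretrained ResNet,
# freeze its backbone, and train only a new classification head.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # hypothetical: the number of classes in your own dataset

# "Shoulders of giants": start from weights learned on ImageNet.
model = models.resnet18(pretrained=True)

# Freeze the pretrained backbone so only the new head learns.
for param in model.parameters():
    param.requires_grad = False

# Swap in a fresh output layer sized for the new task.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Optimize only the head; training then proceeds on your smaller dataset.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()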

Bio:

James is a wanderer, tinkerer, and ponderer. Not one to be pinned down, he's more comfortable in the abstract than in the known. He embraces ambiguity with a bear hug. That's a trick, of course. He bear-hugs the ambiguity into little mathematical boxes and then says "Dance!" and, oh, how that ambiguity dances! James currently hangs his hat at Vacasa, where he works as a Senior Data Scientist tackling a myriad of growth objectives with engineering and machine learning.

Speaker 2 Details:

Abstract:

Five years ago, Word2Vec offered a leap forward, letting the average data scientist run efficient Natural Language Processing algorithms. From a body of text, Word2Vec generates a semantic space in which the trained word vectors are often highly associated with their meanings. The next leap, a semantic space for phrases and sentences, proves tougher, both computationally and in faithfully representing a composite meaning over multiple words. Surprisingly, quantifying particle interactions à la quantum mechanics bears a close mathematical similarity to quantifying the meaning of words, phrases, and sentences. In this talk, I will provide an overview of current techniques for modeling language beyond word vectors, and point out the quantum mechanical aspects of these techniques. Emphasis will be placed on the “Compositional Distributional Semantics” model for the task of identifying word ambiguity.
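
To make the word-vector idea concrete, here is a tiny Word2Vec sketch using gensim; the library choice and toy corpus are assumptions for illustration, not from the talk:

# Train a small semantic space from text, then query it for neighbors.
from gensim.models import Word2Vec

# Hypothetical toy corpus: each document is a list of tokens.
corpus = [
    ["quantum", "mechanics", "models", "particle", "interactions"],
    ["word", "vectors", "capture", "semantic", "meaning"],
    ["sentences", "compose", "meaning", "from", "word", "vectors"],
]

model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, seed=1)

# Each trained word maps to a point in the semantic space...
vector = model.wv["meaning"]

# ...and nearby points tend to be semantically related words.
print(model.wv.most_similar("meaning", topn=3))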
