Viewing 0 current events matching “colloquium” by Date. No events were found.
Viewing 9 past events matching “colloquium” by Date.
Monday, Apr 6, 2009
CS Colloquium: Repairing Software Automatically Using Evolutionary Computation
Stephanie Forrest, University of New Mexico
Portland State University, Fourth Avenue Building (FAB), Room 86-01

Abstract: A pressing challenge for computer science over the next decade is reducing the total cost of software, including the billions of dollars lost each year to software defects. The number of software defects far outstrips the resources available for repairing them, and most software is shipped with both known and unknown bugs. This problem arises because human programmers still develop, maintain, and repair computer programs largely by hand, despite many years of progress in machine learning and artificial intelligence. The talk will describe recent research showing how evolutionary computation can be combined with program analysis methods to automatically repair bugs in off-the-shelf legacy C programs. Once a program fault is discovered, evolutionary algorithms generate program variants until one is found that both retains required functionality and avoids the defect in question. Standard test cases are used to represent the fault and to encode program requirements. Once a successful variant is discovered, structural differencing algorithms and delta debugging methods are used to minimize its size. Initial results will be presented on a wide range of C programs, including security vulnerabilities such as integer overflow, denial of service, format string, and buffer overflow. Finally, the talk will describe how the automatic repair mechanism can be combined with anomaly intrusion detection to produce a closed-loop repair system.

Biography: Stephanie Forrest is Professor and Chairman of the Computer Science Department at the University of New Mexico in Albuquerque. She is also an External Professor and has served on the Science Board and as Vice President of the Santa Fe Institute. Professor Forrest received M.S. and Ph.D. degrees in Computer and Communication Sciences from the University of Michigan (1982, 1985) and a B.A. from St. John's College (1977). Before joining UNM in 1990 she worked for Teknowledge Inc. and was a Director's Fellow at the Center for Nonlinear Studies, Los Alamos National Laboratory. Her research studies adaptive systems, including evolutionary computation, immunology, biological modeling, and computer security. In security, she is best known for her early work using system calls for anomaly intrusion detection and her more recent work on automated diversity. She is a recipient of the NSF Presidential Young Investigator Award and has recently served on the NSF GENI Science Council, the NSF CISE Advisory Committee, and the UCLA CENS Advisory Board.
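The search loop the abstract describes (generate variants, evaluate them against test cases, keep the best, minimize the winner) can be sketched in miniature. This is a hedged toy, not the speakers' actual system: the "program" here is an interpreted list of statements, and the defect, tests, and mutation operators are all invented for illustration.

```python
import random

def run(program, x):
    """Interpret a tiny statement list. 'bad_add' models a defect that
    only fires on large inputs, so some test cases still pass."""
    for op, arg in program:
        if op == "add":
            x += arg
        elif op == "bad_add" and x > 10:  # the injected defect
            x += arg
    return x

def fitness(program, positive_tests, negative_tests):
    """Weight the bug-exposing (negative) tests higher, so a variant that
    fixes the defect while keeping required behavior scores best."""
    return (sum(1 for t in positive_tests if t(program)) +
            10 * sum(1 for t in negative_tests if t(program)))

def mutate(program):
    """Delete, swap, or duplicate a statement, as in statement-level repair."""
    variant = list(program)
    i = random.randrange(len(variant))
    j = random.randrange(len(variant))
    op = random.choice(["delete", "swap", "duplicate"])
    if op == "delete" and len(variant) > 1:
        del variant[i]
    elif op == "swap":
        variant[i], variant[j] = variant[j], variant[i]
    else:
        variant.insert(i, variant[j])
    return variant

def repair(program, positive_tests, negative_tests, generations=2000):
    target = len(positive_tests) + 10 * len(negative_tests)
    best, best_fit = program, fitness(program, positive_tests, negative_tests)
    for _ in range(generations):
        variant = mutate(best)
        f = fitness(variant, positive_tests, negative_tests)
        if f > best_fit:            # simple hill climb, not a full GA
            best, best_fit = variant, f
        if best_fit == target:      # all tests pass: candidate repair found
            break
    return best

buggy = [("add", 2), ("bad_add", 100)]
positive_tests = [lambda p: run(p, 1) == 3]    # required behavior (passes)
negative_tests = [lambda p: run(p, 20) == 22]  # exposes the defect (fails)
repaired = repair(buggy, positive_tests, negative_tests)
```

Deleting the faulty statement is the only single mutation that satisfies every test, so the search converges on it; the research described additionally minimizes the discovered repair with structural differencing and delta debugging.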
Friday, Oct 23, 2009
PSU CS Colloquium: Chaos in Computer Science
Elizabeth Bradley, Computer Science Department, University of Colorado at Boulder
Portland State University, Fourth Avenue Building (FAB), CS conference room, FAB 086-01

Abstract: Although it is not necessarily the view taken by those who design them, modern computers are deterministic nonlinear dynamical systems, and it is both interesting and useful to treat them as such. In this talk, I will describe a nonlinear dynamics-based framework for modeling and analyzing computer systems. Using this framework, together with a custom measurement infrastructure, we have found strong indications of low-dimensional dynamics in the performance of a simple program running on a popular Intel microprocessor, including the first experimental evidence of chaotic dynamics in real computer hardware. These dynamics change completely when we run the same program on a different Intel microprocessor, or when we change that program slightly. All of this raises important issues about computer analysis and design. These engineered systems have grown so complex as to defy the analysis tools that are typically used by their designers: tools that assume linearity and stochasticity, and essentially ignore dynamics. The ideas and methods developed by the nonlinear dynamics community are a much better way to study, understand, and (ultimately) design modern computer systems. This is joint work with Amer Diwan and Todd Mytkowicz.

Biography: Elizabeth Bradley did her undergraduate and graduate work at MIT, interrupted by a one-year leave of absence to row in the 1988 Olympic Games, and has been with the Department of Computer Science at the University of Colorado at Boulder since January of 1993. Her research interests include nonlinear dynamics, artificial intelligence, and control theory. She is the recipient of an NSF National Young Investigator award, a Packard Fellowship, a Radcliffe Fellowship, and the 1999 student-voted University of Colorado College of Engineering teaching award.

Host: Melanie Mitchell
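The usual first step in the kind of nonlinear time-series analysis the abstract alludes to is delay-coordinate embedding, which reconstructs state-space structure from a single measured quantity (for the work described, processor performance measurements). A minimal sketch, with a chaotic logistic-map series standing in for real hardware data:

```python
# Delay-coordinate embedding: turn a scalar series s(t) into vectors
# (s(t), s(t+tau), ..., s(t+(dim-1)*tau)) that reconstruct the dynamics
# of the underlying system (Takens' theorem).
def delay_embed(series, dim, tau):
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + k * tau] for k in range(dim)) for i in range(n)]

# Stand-in data: the chaotic logistic map, NOT actual processor measurements.
series = [0.4]
for _ in range(499):
    x = series[-1]
    series.append(3.9 * x * (1.0 - x))

points = delay_embed(series, dim=3, tau=2)
```

Quantities that distinguish low-dimensional chaos from noise, such as correlation dimension or Lyapunov exponents, are then estimated on the embedded point cloud.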
Monday, Apr 9, 2012
Real-Life Learning Agents
Portland State University, FAB, Room 86-09

Abstract: Agents, defined as programs or robots that interact with their environment, are becoming increasingly common. However, the current generation of agents is often unable to robustly interact with other agents, or with humans, severely limiting the number of tasks they can accomplish. Furthermore, these agents are typically unable to adapt to their environment, a critical skill when programmers do not have full knowledge of agents' future environments or when those environments may change over time. This talk will discuss recent work combining autonomous learning of sequential decision-making tasks with transfer learning, a general approach to sharing knowledge between agents with different capabilities, resulting in significant improvements to learning speeds and abilities.
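As a hedged illustration of the combination the abstract describes (not the speaker's actual method or tasks), here is tabular Q-learning on an invented corridor task, where "transfer" is implemented by seeding the target task's Q-table with values learned on a smaller source task:

```python
import random
from collections import defaultdict

def chain_step(state, action, n):
    """Corridor of n states; actions move left (-1) or right (+1).
    Reaching the right end yields reward 1 and ends the episode."""
    nxt = max(0, min(n - 1, state + action))
    return nxt, (1.0 if nxt == n - 1 else 0.0), nxt == n - 1

def q_learning(n, episodes, q=None, alpha=0.5, gamma=0.9, eps=0.2):
    q = q if q is not None else defaultdict(float)
    for _ in range(episodes):
        state, done = 0, False
        while not done:
            if random.random() < eps:                       # explore
                action = random.choice([-1, 1])
            else:                                           # exploit
                action = max([-1, 1], key=lambda a: q[(state, a)])
            nxt, reward, done = chain_step(state, action, n)
            best_next = 0.0 if done else max(q[(nxt, -1)], q[(nxt, 1)])
            q[(state, action)] += alpha * (reward + gamma * best_next
                                           - q[(state, action)])
            state = nxt
    return q

# Source task: short corridor. Target task: longer corridor. Transfer =
# initializing the target's Q-table from the source's values.
source_q = q_learning(n=4, episodes=200)
target_q = q_learning(n=7, episodes=300, q=defaultdict(float, source_q))
```

The transferred values are only approximate for the longer corridor, but they bias early exploration toward the goal, which is the sense in which transfer speeds up learning; for tasks with different representations, an inter-task mapping would translate states and actions first.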
Tuesday, Apr 10, 2012
Efficiently Learning Probabilistic Graphical Models
Portland State University, FAB, Room 86-09

Abstract: Probabilistic graphical models are used to represent uncertainty in many domains, such as error-correcting codes, computational biology, sensor networks, and medical diagnosis. This talk will discuss two approaches to the problem of learning graphical models from data, focusing on computational challenges. The first is marginalization-based learning, where parameters are fit in the context of a specific approximate inference algorithm; this will include results on image processing and computer vision problems. The second is recent work on Markov chain Monte Carlo based learning, inspired by a computational biology project.
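As a minimal, invented example of the two ingredients the abstract mentions, consider a two-variable binary model where exact marginals are available by enumeration (the role approximate inference plays in marginalization-based learning) and a Gibbs sampler estimates the same quantity (the MCMC side). The parameters A, B, J are arbitrary illustration values.

```python
import math
import random
import itertools

# Two-variable binary MRF: p(x1, x2) proportional to exp(A*x1 + B*x2 + J*x1*x2)
A, B, J = 0.3, -0.5, 1.0

def unnorm(x1, x2):
    return math.exp(A * x1 + B * x2 + J * x1 * x2)

def exact_marginal_x1():
    """Exact p(x1 = 1) by brute-force enumeration of all states."""
    z = sum(unnorm(x1, x2) for x1, x2 in itertools.product((0, 1), repeat=2))
    return sum(unnorm(1, x2) for x2 in (0, 1)) / z

def gibbs_marginal_x1(n_samples=20000, burn_in=1000):
    """Estimate p(x1 = 1) by Gibbs sampling: alternately resample each
    variable from its conditional distribution given the other."""
    x1, x2, count = 0, 0, 0
    for t in range(burn_in + n_samples):
        p1 = unnorm(1, x2) / (unnorm(0, x2) + unnorm(1, x2))
        x1 = 1 if random.random() < p1 else 0
        p2 = unnorm(x1, 1) / (unnorm(x1, 0) + unnorm(x1, 1))
        x2 = 1 if random.random() < p2 else 0
        if t >= burn_in:
            count += x1
    return count / n_samples
```

In a real model with many variables, the enumeration in `exact_marginal_x1` is intractable, which is precisely why learning must be paired with an approximate inference or sampling scheme.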
Wednesday, Apr 11, 2012
Motors, Voters, and the Future of Embedded Security
Portland State University, FAB, Room 86-09

Abstract: The stereotypical view of computing, and hence computer security, is a landscape filled with laptops, desktops, smartphones and servers; general purpose computers in the proper sense. However, this is but the visible tip of the iceberg. In fact, most computing today is invisibly embedded into systems and environments that few of us would ever think of as computers. Indeed, applications in virtually all walks of modern life, from automobiles to medical devices, power grids to voting machines, have evolved to rely on the same substrate of general purpose microprocessors and (frequently) network connectivity that underlie our personal computers. Yet along with the power of these capabilities come the same potential risks as well. My research has focused on understanding the scope of such problems by exploring vulnerabilities in the embedded environment, how they arise, and the shape of the attack surfaces they expose. In this talk, I will particularly discuss recent work on two large-scale platforms: modern automobiles and electronic voting machines. In each case, I will explain how implicit or explicit assumptions in the design of the systems have opened them to attack. I will demonstrate these problems, concretely and completely, including arbitrary control over election results and remote tracking and control of an unmodified automobile. I will explain the nature of these problems, how they have come to arise, and the challenges in hardening such systems going forward.
Friday, Apr 13, 2012
Information Leakage from Encrypted Voice over IP: Attacks and Defenses
Portland State University, FAB, Room 86-09

Abstract: In this talk, I describe two side-channel traffic analysis attacks on encrypted voice-over-IP calls and a novel technique for efficiently defending against such attacks. We begin with a review of the basics of speech coding to understand how and why information can leak out of an encrypted VoIP call. We then discuss the techniques for recovering hidden information: first, how to identify the language spoken in the call, and then how to spot particular phrases. Our techniques are completely speaker-independent and require no recorded examples of the target phrase. Nevertheless, we show that they achieve surprising accuracy on widely used speech corpora. Finally, we consider methods for limiting this information leakage. Experimental results show that an intelligent, adaptive adversary can convincingly deceive such traffic analyses while incurring much lower overhead than previously expected.
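The leak described here comes from variable-bit-rate speech coding: the encoder picks a bit rate per frame depending on the sound, so the sequence of encrypted packet lengths correlates with what was said. A deliberately crude sketch of phrase spotting, using edit distance against per-phrase length templates as a stand-in for the statistical models in the actual research; all lengths and phrases are invented:

```python
def edit_distance(a, b):
    """Levenshtein distance between two sequences (row-by-row DP)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # delete ca
                           cur[j - 1] + 1,               # insert cb
                           prev[j - 1] + (ca != cb)))    # substitute
        prev = cur
    return prev[-1]

def guess_phrase(observed, templates):
    """Return the template name closest to the observed length sequence."""
    return min(templates, key=lambda name: edit_distance(observed, templates[name]))

templates = {  # hypothetical per-phrase packet-length signatures (bytes)
    "attack at dawn":  [52, 60, 60, 41, 52, 60, 41, 41, 52],
    "retreat at once": [41, 41, 52, 60, 52, 41, 60, 60, 52, 41],
}
observed = [52, 60, 60, 41, 52, 52, 41, 41, 52]  # capture with one garbled frame
```

The defense side of the talk amounts to reshaping these length sequences (padding or adaptive rate selection) so that the distances between phrases collapse, at the lowest possible bandwidth cost.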
Duckki Oe: Formally Certified Satisfiability Solving
Portland State University, FAB, Room 86-09

Abstract: Satisfiability (SAT) and satisfiability modulo theories (SMT) solvers are efficient automated theorem provers widely used in fields such as formal verification and artificial intelligence. Although the underlying propositional and predicate logics are traditional and well understood, SAT/SMT solvers are complex software, highly optimized for performance. Because SAT/SMT solvers are commonly used as the final verdict for formal verification problems, their correctness is an important issue. This talk discusses two methods to formally certify SAT/SMT solvers. The first method is generating proofs from solvers and certifying those proofs. One issue for proof checking is that SMT logics are constantly growing, so a flexible framework for expressing proof rules is needed. The proposal is to use a meta-language called LFSC, which is based on the Edinburgh Logical Framework with an extension for expressing computational side conditions. SAT and SMT logics can be encoded in LFSC, and the encoding can be easily and safely extended to new logics. It has been shown that an optimized LFSC checker can certify SMT proofs very efficiently. The second method is using a verified programming language to implement a SAT solver and verify the code statically. Guru is a pure functional programming language with support for dependent types and theorem proving. A modern SAT solver, versat, has been implemented and verified to be correct in Guru. Guru also allows very efficient code generation through resource types, so the performance of versat is comparable with that of the current proof-checking technology using a state-of-the-art solver.
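The proof-certification idea in the first method can be made concrete with the simplest possible instance: instead of trusting a SAT solver's "unsatisfiable" answer, check a resolution refutation it emits. A toy checker (far simpler than LFSC, which also handles SMT theories and computational side conditions):

```python
def resolve(c1, c2, pivot):
    """Resolve two clauses (frozensets of signed literals) on a pivot:
    from (pivot OR rest1) and (NOT pivot OR rest2), derive (rest1 OR rest2)."""
    assert pivot in c1 and -pivot in c2, "illegal resolution step"
    return (c1 - {pivot}) | (c2 - {-pivot})

def check_refutation(clauses, proof):
    """proof: list of (i, j, pivot) steps; each resolvent is appended to
    the clause list. The proof certifies unsatisfiability iff it derives
    the empty clause. The checker, not the solver, is the trusted code."""
    derived = [frozenset(c) for c in clauses]
    for i, j, pivot in proof:
        derived.append(resolve(derived[i], derived[j], pivot))
    return frozenset() in derived

# (x1) AND (NOT x1 OR x2) AND (NOT x2) is unsatisfiable:
cnf = [{1}, {-1, 2}, {-2}]
refutation = [(0, 1, 1),   # resolve (x1) with (NOT x1 OR x2): derive (x2)
              (3, 2, 2)]   # resolve (x2) with (NOT x2): empty clause
```

The payoff is a small trusted computing base: only the checker must be correct, and the solver can be optimized aggressively. The second method in the talk removes even the checker by verifying the solver itself.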
Monday, Apr 16, 2012
Information Discovery in Large Complex Datasets
Portland State University, FAB, Room 86-09

Abstract: The focus of my research is on enabling novel kinds of interaction between the user and the information in a variety of digital environments, ranging from social content sites, to digital libraries, to the Web. In the first part of this talk, I will present an approach for tracking and querying fine-grained provenance in data-intensive workflows. A workflow is an encoding of a sequence of steps that progressively transform data products. Workflows help make experiments reproducible, and may be used to answer questions about data provenance: the dependencies between input, intermediate, and output data. I will describe a declarative framework that captures fine-grained dependencies, enabling novel kinds of analytic queries, and will demonstrate that careful design and leveraging distributed processing make tracking and querying fine-grained provenance feasible. In the second part of this talk, I will discuss information discovery on the Social Web, where users provide information about themselves in stored profiles, register their relationships with other users, and express their preferences with respect to information and products. I will argue that information discovery should account for a user's social context, and will present network-aware search, a novel search paradigm in which result relevance is computed with respect to a user's social network. I will describe efficient algorithms appropriate for this setting, and will show how social similarities between users may be leveraged to make processing more efficient.
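A hedged sketch of the network-aware search idea from the second part, with all names and data invented: score candidate items for a user by aggregating endorsements from the user's network, weighting each friend by profile similarity rather than using global relevance alone.

```python
def jaccard(a, b):
    """Similarity of two users' profiles, modeled as sets of interest tags."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def network_aware_rank(user, friends, profiles, endorsements):
    """Rank items by endorsement weight accumulated over the user's
    network, with each friend weighted by similarity to the user."""
    scores = {}
    for friend in friends.get(user, ()):
        w = jaccard(profiles[user], profiles[friend])
        for item in endorsements.get(friend, ()):
            scores[item] = scores.get(item, 0.0) + w
    return sorted(scores, key=scores.get, reverse=True)

# Invented example: ann's tastes overlap with bob's but not eve's, so
# bob's endorsements dominate ann's ranking.
profiles = {"ann": {"ml", "db"}, "bob": {"ml", "db", "web"}, "eve": {"art"}}
friends = {"ann": ["bob", "eve"]}
endorsements = {"bob": ["paper1", "paper2"], "eve": ["paper3"]}
ranking = network_aware_rank("ann", friends, profiles, endorsements)
```

The talk's contribution is making this kind of scoring efficient at scale; a real system would combine the social weight with query relevance and prune the network traversal rather than visiting every friend.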
Monday, Oct 15, 2012
Research Talk: Agile Tooling for C++
Portland State University, FAB, Room 86-09
Title: Test-Driven Development and Mock Objects for C++ in Eclipse
Speaker: Prof. Peter Sommerlad, Institute for Software at FHO/HSR Rapperswil, Switzerland

Abstract: At the IFS Institute for Software, several plug-ins have been developed for the Eclipse C/C++ Development Tools (CDT) to assist Agile C++ developers. Some of the features have already been integrated into CDT, such as the refactoring infrastructure and individual refactorings like toggling function definition and declaration. In this talk, Prof. Sommerlad will explain how IFS's plug-ins make it easier to adopt an agile style of development, through code generation for test-driven development (TDD), unit testing, test doubles and mock objects, quick feedback from static analysis tools, and quick-fixes for problems.

Speaker Bio: Prof. Peter Sommerlad is head of the IFS Institute for Software at FHO/HSR Rapperswil. Peter is co-author of the books POSA Vol. 1 and Security Patterns. His goal is to make software simpler by Decremental Development: refactoring software down to 10% of its size with better architecture, testability, quality, and functionality. Peter is also the author of the CUTE unit testing framework. He inspired and leads several Eclipse CDT plug-in projects, such as CUTE unit testing, Sconsolidator, Mockator, Linticator, and Includator. IFS contributed most of the CDT refactoring infrastructure and is employing it to develop further TDD and refactoring support for Eclipse CDT.