Available Projects for Summer 2014
Mentor: Kostas Bekris
Project Title: Demonstrating Tasks to Robots and Interactive Planning
The student involved in this project will have the opportunity to work with a state-of-the-art robot (dual-arm Baxter robot from Rethink Robotics) and study interaction problems between robots and people. Some of the questions in this context are the following:
- How can a person effectively specify a task to a robot? What kind of sensing input should be used to communicate a task to the robot? The objective is for a human to be able to demonstrate a task to a robot both visually and verbally, the same way one would to a fellow human. The robot needs to reason about the demonstration and identify from it the task to be completed, such as assembling a product from individual parts.
- How can a robot plan its actions so as to solve that task? Given the task specification, the robot then solves the corresponding motion planning challenge. The interaction between the robot and the human can continue until the task has been effectively learned and the motion planning sequence satisfies the human's requirements. In the process of solving the task, the robot should be able to operate safely next to people and other robots.
The student will prepare an experimental setup in which alternative methodologies for the above processes can be implemented and evaluated. The setup will include a robotic manipulator, making it possible to safely evaluate potential solutions on real-world benchmarks.
Requirements: The student should be familiar with programming in C++ and the Boost libraries, and should be willing to learn the basics of ROS (the "Robot Operating System" from Willow Garage) and related open-source libraries for motion planning.
Mentor: Jacob Feldman
Project Title: A world of autonomous agents.
Feldman and graduate students have recently created a programmable artificial life environment containing a population of autonomous virtual agents. The platform has two complementary goals: a) to create model-driven autonomous agents capable of intelligent behavior, and b) to investigate human interpretation of that behavior, focusing on a timely topic in cognition, namely, the perception of intentionality. Each agent is an autonomous program capable of a variety of behaviors, such as foraging for food, interacting with other agents, and even competing for survival. The agents are endowed by their programs with goals, intentions, perceptual capacities, the capacity to explore and learn in their environment, and the capacity to plan their future behavior.
Programming new agents is a creative but feasible exercise for novices because the agents' programs are written in a modular language built on basic behavioral and perceptual building blocks. Undergraduate researchers can endow them with new perceptual faculties, improved action planning, or even new goals and competitive frameworks. They can also assess how well and under what conditions human observers are able to interpret the agents' mental states, that is, deduce what the agent was "thinking" based on its actions.
This virtual environment thus serves as a novel platform for investigating key problems at the interface of computer science and perceptual psychology, including: visual interpretation of complex dynamic displays, population dynamics of artificial perceptual agents, and computational procedures for interpretation of intentional action. Students with a programming background can participate in the development of new agents and interactive interfaces.
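To give a feel for the kind of agent involved, here is a minimal sketch of an autonomous agent with simple perceptual and action building blocks. The class and method names are illustrative assumptions, not the platform's actual modular language.

```python
# Hypothetical sketch of an autonomous foraging agent on a grid.
# All names (Agent, perceive, act) are illustrative, not from the actual platform.
class Agent:
    def __init__(self, pos, goal):
        self.pos = pos    # current grid position
        self.goal = goal  # target position (e.g., a food source)

    def perceive(self):
        # Minimal "perception": the displacement toward the goal.
        return (self.goal[0] - self.pos[0], self.goal[1] - self.pos[1])

    def act(self):
        # Minimal "planning": a greedy step toward the goal, one axis at a time.
        dx, dy = self.perceive()
        if dx != 0:
            self.pos = (self.pos[0] + (1 if dx > 0 else -1), self.pos[1])
        elif dy != 0:
            self.pos = (self.pos[0], self.pos[1] + (1 if dy > 0 else -1))

agent = Agent(pos=(0, 0), goal=(3, 2))
for _ in range(10):
    agent.act()
print(agent.pos)  # (3, 2): the agent has foraged its way to the food
```

Richer agents would replace the greedy step with learned policies and the goal coordinate with a perceptual model, which is exactly the kind of extension the project invites.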
Shapes are often represented according to their part structure; examples include the "skeletal" model developed by Singh and Feldman. Such models can be used to recognize objects, to create abstract imagery by simplifying and removing parts, and to compute relevant geometric features such as local symmetries. Projects can refine these models by representing new kinds of part organization (e.g., operations that create texture) or by encoding constraints on part structure for specific classes of shape (e.g., biological forms, man-made artifacts, or moving shapes).
Mentor: Randy Gallistel
Project Title: Automated Cognitive NeuroGenetic Screening Using a Custom Matlab Toolbox
Run a platoon of genetically manipulated mice through a computer-controlled sequence of tasks in a fully automated 24/7 live-in test environment and analyze the data using a Matlab Toolbox developed for analyzing the kind of time-stamped event record obtained in these experiments. The project involves learning to use the toolbox and, in the process, helping to improve the manual and tutorial materials that we are preparing to enable other labs to use this system. The tests are designed to detect heritable malfunctions in basic mechanisms of cognition, particularly those mechanisms that enable animals to localize themselves in space and time.
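The raw data in such experiments are time-stamped event records. As a rough illustration (in Python rather than the toolbox's actual Matlab, with invented event codes), one basic analysis is computing intervals between successive occurrences of an event type:

```python
# Hypothetical time-stamped event record: (time_in_seconds, event_code) pairs.
# The event codes and record format here are illustrative, not the toolbox's.
events = [(0.0, "poke"), (1.2, "feed"), (3.5, "poke"), (3.9, "poke"), (8.0, "feed")]

def inter_event_intervals(events, code):
    """Intervals between successive occurrences of one event type."""
    times = [t for t, c in events if c == code]
    return [round(b - a, 3) for a, b in zip(times, times[1:])]

print(inter_event_intervals(events, "poke"))  # [3.5, 0.4]
```

Distributions of such intervals are the kind of quantity from which timing and localization mechanisms can be inferred.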
Requirements: Course work in statistics; Matlab programming skills.
Mentor: Pernille Hemmer
Project Title: Modeling human belief updating.
The goal of this project is to develop a computational framework for assessing individual differences in dynamically changing beliefs. The project brings together three lines of research bearing on the assessment of subjective beliefs. The first stems from experimental work quantifying prior beliefs in naturalistic environments, which shows that beliefs influence episodic memory and decision-making. The second arises from rational models of cognition and asks why using beliefs in memory and decision-making is an optimal strategy. The third comes from Bayesian data analysis to infer latent psychological parameters, e.g., beliefs, in a given cognitive model, and asks if there are individual differences in these psychological processes.
This project will require students to use a combination of behavioral experimentation and Bayesian modeling to investigate how people update their beliefs in changing environments. The project relies heavily on programming, both for designing an experimental interface and the model development.
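One simple way to model belief updating in a changing environment, offered here only as a hedged sketch (the project's actual models and parameters would differ), is a Beta-Bernoulli update with exponential discounting of old evidence:

```python
# Sketch: Beta-Bernoulli belief updating with exponential discounting,
# so recent outcomes dominate when the environment can change.
# The decay parameter and prior are illustrative choices.
def update(alpha, beta, outcome, decay=0.95):
    alpha, beta = decay * alpha, decay * beta  # forget old evidence gradually
    if outcome:
        alpha += 1.0
    else:
        beta += 1.0
    return alpha, beta

alpha, beta = 1.0, 1.0                    # uniform prior
for outcome in [1, 1, 1, 0, 0, 0, 0, 0]:  # the environment switches midway
    alpha, beta = update(alpha, beta, outcome)
belief = alpha / (alpha + beta)           # posterior mean estimate of p(success)
print(belief < 0.5)  # True: recent failures outweigh the earlier successes
```

Fitting the decay rate (and the prior) to individual participants is one way such a model could expose individual differences in belief updating.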
Requirements: Programming skills (Matlab preferred)
Mentor: Eileen Kowler
Project title: Visual search
Visual search is an operation performed both by humans and by machines (for example, robots navigating through cluttered environments, or people searching through a desk for a set of keys). This project will require students to develop a simple but realistic visual search task for humans to perform (for example, finding an object hidden under another on the screen) in which the likelihood of finding the target in a particular location is traded off against the cost (search time or effort) of accessing the most likely locations. Performance will be measured by recording movements of the eye or arm. The search patterns adopted by the human participants will be compared to models of optimal strategies. For further information about lab activities, see Prof. Kowler's web page.
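The probability-versus-cost tradeoff above can be sketched as a tiny decision rule. This is a hedged illustration with made-up numbers, not the lab's model: each location gets a score combining the probability that it holds the target and the cost of inspecting it.

```python
# Sketch: choose the next search location by expected gain = probability - cost.
# The values and the linear tradeoff are illustrative assumptions.
def next_fixation(p_target, cost):
    scores = [p - c for p, c in zip(p_target, cost)]
    return scores.index(max(scores))

p_target = [0.50, 0.30, 0.20]  # prior probability the target is at each location
cost     = [0.40, 0.05, 0.05]  # relative effort/time to inspect each location
print(next_fixation(p_target, cost))  # 1: the cheap, moderately likely spot wins
```

Comparing human fixation sequences against such optimal (or near-optimal) choice rules is the kind of analysis the project envisions.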
Requirements: Matlab programming skills. Coursework in statistics recommended.
Mentor: Melchi Michel
Project Title: Temporal integration of visual information across eye movements
Humans make rapid, ballistic eye movements several times per second, and our mental representation of the visual environment is somehow built up from the resulting sequence of views. How accurately do we store information from individual views, and how efficiently do we combine information across a sequence of views? This project will involve developing models of visual integration across multiple eye movements and designing visual experiments to test these models. Among the available tools are an accurate high-speed eye tracking system and software for real-time gaze-contingent updating of the display.
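A standard benchmark for efficient combination of noisy views is the ideal observer that weights each estimate by its reliability (inverse variance). The sketch below, with illustrative numbers, shows how two noisy position estimates combine into one more reliable estimate:

```python
# Sketch: ideal-observer integration of noisy estimates from successive views,
# weighting each by its inverse variance. Numbers are illustrative.
def integrate(estimates, variances):
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * x for w, x in zip(weights, estimates)) / total
    return mean, 1.0 / total  # combined variance is lower than any single view

mean, var = integrate(estimates=[10.0, 12.0], variances=[1.0, 4.0])
print(mean, var)  # 10.4 0.8
```

Human performance falling short of this benchmark would indicate losses in storage or combination across saccades, which is what the experiments are designed to measure.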
Requirements: Interested students should have coursework in statistics and/or probability theory and basic programming experience in C/C++, Matlab, and/or Python. Experience with graphics programming in OpenGL, knowledge of linear algebra, and experience with experimental methods would also be helpful but are not required.
Mentor: Elizabeth Torres
Project: Models of human movement
Build an interface that captures the positions and orientations of hand movements in real time (using available sensor data) and outputs "fake" feedback, also in real time. The underlying algorithm should fool a human performer into believing they are moving in a certain way when, in reality, they are moving differently. Trial-to-trial repeats should be stochastic enough that no pattern is apparent, yet in the long run a pattern should emerge that drives the motor and perceptual systems to adopt new performance patterns.
Requirements: Knowledge of Matlab and C++ is recommended.
Perceptual Science and Technology REU Program 2014
Document last modified on October 23, 2015.