Fall 1998 - readings and advanced topics in multimedia - students

Name E-Mail Address Ideas
Barrera, Jose W jb238@is8.nyu.edu evolutionary music
Biermann, Henning* biermann@cs.nyu.edu Motion tracking
Chapman, Emily emily@cs.nyu.edu Children generating animation w/Improv (Intellistories)
Crutcher, Henry crutcher@cs.nyu.edu Children generating animation w/Improv (Intellistories)
Diament, Judah diam7644@cs.nyu.edu An Improved FilmFinder
Fu, Xiaodong xf203@is8.nyu.edu Improv + personality
Hertzmann, Aaron hertzman@mrl.nyu.edu Randomized algorithms for painterly style rendering
Su, Hui-wen* huiwen@puma.mt.att.com Improv
Jen, Yunyung jen7655@sparky.cs.nyu.edu Multiperspective projection (w. Denis Zorin)
Lin, Larry lin7667@cs.nyu.edu Visualizing code w. multiscale interfaces
Meyer, Jon meyer@mrl.nyu.edu Constraints for zooming interfaces
Mitchell, Chris mitc7205@cs.nyu.edu Zooming into cities
Schneider, Fred fws201@is8.nyu.edu Visualizing 4D objects
Sudo, Kiyoshi ks341@is8.nyu.edu Lip reading (with Chris Bregler)
Wang, Hua wanghua@cs.nyu.edu Haptic feedback of textures
Zhang, Xiaoge xiaoge@cs.nyu.edu A.I. + Improv; 2 actors cooperating on a task
Zheng, Jian jianzhen@cs.nyu.edu A.I. to evolve Improv personality

Various students' research ideas

Jose Barrera:

  1. Evolutionary Music.
  2. Cyber-Plants (Artificial Life):
    1. Create a box with sensors for light, temperature, humidity, sound, traffic from the net...
    2. Attach the output of these sensors to the computer.
    3. Create a fractal-like plant that grows over time, whose growth is determined by the inputs from the sensor box.
    4. Create a colony of "cells" that grow where the environmental conditions are affected by the sensor box.
    Goals: See how cumulative inputs from the sensors affect the overall growth. Define the fractal that is most likely to behave like a living organism. Create a set of rules for the growth of the plants and/or the cells.

    Commercial goals: Create a screen-saver-like product; create a virtual lab where you can build your own "creatures".
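The growth rules in steps 3-4 could be prototyped with an L-system whose rewrite rule is chosen from the sensor readings. A toy sketch (the rules and the sensor dictionary are illustrative stand-ins for the real sensor box, not part of the project):

```python
# Hypothetical sketch: an L-system "plant" whose growth rule is biased
# by environmental sensor readings, as in the cyber-plants idea above.

RULES_SUNNY = {"F": "F[+F]F[-F]F"}   # bushy branching in good conditions
RULES_DIM   = {"F": "FF"}            # sparse, stem-only growth otherwise

def grow(axiom, sensors, steps):
    """Rewrite the axiom `steps` times, picking rules from the sensors."""
    rules = RULES_SUNNY if sensors["light"] > 0.5 else RULES_DIM
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

sunny = grow("F", {"light": 0.9}, 2)   # branched string, drawn as a bush
dim   = grow("F", {"light": 0.1}, 2)   # bare stem
```

Rendering the resulting string with turtle graphics (F = draw forward, [ / ] = push/pop, + / - = turn) would give the visible plant; cumulative sensor history could switch or blend the rule sets over time.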

Emily Chapman and Henry Crutcher:

Using the Improv system to create an interactive story creator for children. Children will be able to add scenery, choose and name the characters, modify their appearances, and give them certain behaviors using the Improv system.

This week - created our prototypical character, who can turn her head and wiggle it (at the same time! :-)). Note: the VRML code is on Emily's home page.

  • First stage: learn the Improv system better by creating some high-level states for Linda; modify the source code to remove the splash screens whenever we reload or resize. (Needless to say, can we have the source code :-)???)
  • Second stage: figure out how to efficiently create copies of Linda all running in the same scene.
  • Third stage: create an interface to add scenery and extra Lindas.
  • Fourth stage: expand the interface to modify characters' appearance and behaviors. Add text input.
  • Fifth stage: figure out how to save the page to the server.
  • Sixth stage: add multiple pages.
Aaron Hertzmann:

    Randomized algorithms for painterly style rendering

    The goal of this project is to create a painting algorithm that can paint in a wide variety of styles, beginning from photographs, 3D models, and video. The key is the use of an energy functional to specify the style, and a relaxation framework for computing the painting. The energy function measures how close a painting adheres to a given style. The appeal of this method is that the designer specifies how the painting should look, rather than giving an algorithm for painting. The energy function can embody different ideas about what is desirable in a painting, such as economy of line, or adherence to the source image. Styles can be interpolated simply by interpolating their energy functions. We envision an interactive authoring system that will allow the user to edit styles with a GUI or recursive ZUI during the painting process. We would also like to be able to specify different styles for different image regions, with a painting interface.

    The relaxation algorithm creates a painting by trial-and-error: changes are proposed to the painting, and any change that improves the painting is accepted. Simply using random changes takes too long --- the trick is to select possible changes with a high probability of success.
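The accept-if-it-improves loop can be sketched in miniature. A toy version (assumptions: a 1-D grayscale "image", strokes are constant-value segments, and the energy is plain squared distance to the source; a real style energy would be far richer):

```python
import random

# Toy relaxation painting: propose random "strokes", keep only the
# ones that lower the energy (distance to the source image).

def energy(painting, source):
    return sum((p - s) ** 2 for p, s in zip(painting, source))

def relax(source, iterations=2000, seed=0):
    rng = random.Random(seed)
    painting = [0.0] * len(source)        # start from a blank canvas
    e = energy(painting, source)
    for _ in range(iterations):
        # Propose a stroke: a random segment painted one random value.
        i = rng.randrange(len(source))
        j = rng.randrange(i, len(source))
        value = rng.random()
        trial = painting[:i] + [value] * (j - i + 1) + painting[j + 1:]
        te = energy(trial, source)
        if te < e:                        # accept only improving changes
            painting, e = te and trial or trial, te
    return painting, e

source = [0.1, 0.1, 0.8, 0.8, 0.8, 0.3]
painting, final_e = relax(source)
```

The "trick" mentioned above corresponds to replacing the uniform proposal here with one biased toward strokes likely to reduce the energy (e.g. sampling segment colors from the source image).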

    Xiaodong Fu:

    Improv + personality

    My project will be work on the Improv system. My main interests are personality and emotion representation and computation, the interactive influence between the user and the characters, and an easy way for an author to specify these factors when creating a character.


    Detailed description:

    My project aims to construct a character (or several characters) in an environment, facing several choices and doing something (like walking toward the table and picking up its favorite fruit). As in the example of SID & THE PENGUINS, the user can change the actions of the character and create their own performance. Some control over the character's personality will be added (for example, if the character is shy, he walks with small steps, head down, when somebody stares at him), and some emotion-expressing functionality will also be considered (for example, if the user gives the character an apple but the character likes bananas more, he should not look very happy at that time). As for adjusting, I hope to quantify these factors and make the author able to change them at run time.


    1. First, I need to get familiar with the Improv system as soon as possible.
    2. Second, construct the environment and a character with an adjustable personality.
    3. Third, focus on adding an interactive aspect for the character (that is, add a virtual character for the user, whose actions the user controls).
    4. Then add emotion-expressing functions for the character.
    Besides the programming, I need to read more detailed documentation on Improv, and also some papers on emotion and personality expression in virtual characters; I am very interested in Clark Elliott's work in this field.
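The "quantified, adjustable" personality in step 2 could be as simple as a dictionary of trait values in [0, 1] that scores candidate actions. A hypothetical sketch (all trait and action names here are made up for illustration):

```python
# Sketch of quantified personality: traits are numbers an author can
# tune at run time; actions are chosen by scoring against the traits.

def choose_action(personality, options):
    """Pick the option whose trait weights best match the personality."""
    def score(option):
        return sum(personality.get(trait, 0.0) * weight
                   for trait, weight in option["weights"].items())
    return max(options, key=score)["name"]

shy_actor  = {"shyness": 0.9, "confidence": 0.1}
bold_actor = {"shyness": 0.1, "confidence": 0.9}

options = [
    {"name": "look down, small steps", "weights": {"shyness": 1.0}},
    {"name": "wave and walk over",     "weights": {"confidence": 1.0}},
]
```

Because the traits are plain numbers, an author could drag a slider from shy to bold while the scene runs and watch the chosen actions change.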

    Larry Lin:

    Code Visualization

    Problem: Programming is a complex task. The process of producing an algorithm to solve a problem involves complexity. Furthermore, once an algorithm is produced, the process of verifying that it actually works is yet another aspect of that complexity.

    Programming inherits some of this complexity from the way code is represented. Programming languages require that code be represented as symbols linked together in a sentence-like fashion. When code grows to substantial complexity, it becomes a challenge to follow what it is doing because it appears as a jumble of text. For example, trying to follow a large sequence of "if-then-else" statements can be extremely frustrating, as can trying to keep track of a chain of procedure calls with no help but one's memory.

    Proposed Solution: By providing a visual representation for code, one can gain a sharper understanding of how the code works in both the production and test phases.

    Implementation: I will implement a code-to-diagram translator that will translate a subset of Ada language constructs into diagrams. Once the diagrams are generated, one may execute the program and watch the execution flow among the components of the diagrams. Each component can be zoomed in and out, which in effect reflects the level of the call hierarchy. As the program executes, data modifications will be shown in a window to reflect the status of the execution.
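The first stage of such a translator is extracting a call graph: one diagram component per procedure, one edge per call. A minimal sketch over a toy Ada-like subset (a real translator would use a proper Ada parser; the sample source is invented):

```python
import re

# Extract a call graph from a toy Ada-like subset: each procedure
# becomes a node, each call inside its body becomes an edge.

SOURCE = """
procedure Main is begin Read_Input; Process; end Main;
procedure Process is begin Validate; end Process;
procedure Read_Input is begin null; end Read_Input;
procedure Validate is begin null; end Validate;
"""

def call_graph(source):
    graph = {}
    for name, body in re.findall(
            r"procedure\s+(\w+)\s+is\s+begin\s+(.*?)\s+end\s+\1;", source):
        # Every "Ident;" in the body is treated as a call (skip "null;").
        graph[name] = [c for c in re.findall(r"(\w+);", body) if c != "null"]
    return graph

graph = call_graph(SOURCE)
```

Laying this graph out as nested boxes, with each box expandable, gives exactly the zoomable call-hierarchy view described above; animating the executing procedure's box would show the execution flow.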

    Christopher Mitchell:

    Zooming into cities
    Other projects to do eventually:

    Hui-wen Su:

    Improv Texture/Effects

    Special texture creation: I am interested in manipulating special texture mapping. It seems that Improv will still use VRML for its GUI, and there do not seem to be many kinds of texture in VRML.

    Improv effects: As for effects, I am interested in applying the Crowds activity to some effects in Improv, such as sparkling water, snow, ....

    Detailed description:
    1. Texture: My project hopes to build new texture mapping for both the environment and the characters in Improv, such as drawing the texture directly onto the model, or adding more texture-mapping methods. The goal is more complicated textures without increasing rendering time too much on the GUI side.
    2. Effects: Apply particles to the environment (or the character), such as water or fire, in Improv.

    Steps:
    1. First, understand what kinds of texture mapping VRML supplies, and how Improv Crowds works.
    2. Get familiar with texture and particle theory and algorithms.
    3. Create and apply a particle package / texture package in Improv.
    4. Create effects / textures in Improv along with other characters and the environment.
    About the facial motion interface from last class: I designed two background interfaces for facial Improv at http://www.geocities.com/Hollywood/Picture/4373/face2.html
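The particle effects in step 3 boil down to a small per-frame update. A bare-bones sketch for something like falling snow (all the constants and field names here are illustrative placeholders; an Improv version would drive this once per rendered frame):

```python
# Minimal particle step: gravity, integration, and lifetime culling.

GRAVITY = -9.8  # m/s^2, acting on the y axis

def step(particles, dt):
    """Advance each particle one time step; drop expired ones."""
    out = []
    for p in particles:
        p = dict(p)                 # copy so the caller's list is untouched
        p["vy"] += GRAVITY * dt     # accelerate
        p["x"]  += p["vx"] * dt     # integrate position
        p["y"]  += p["vy"] * dt
        p["life"] -= dt
        if p["life"] > 0:
            out.append(p)
    return out

snow = [{"x": 0.0, "y": 10.0, "vx": 0.1, "vy": 0.0, "life": 2.0}]
for _ in range(10):
    snow = step(snow, 0.1)
```

Water or fire would use the same loop with different emission rates, initial velocities, and forces (buoyancy instead of gravity for fire), plus a textured sprite per particle on the VRML side.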

    Hua Wang:

    Application of a haptic device

    Goals: Use the haptic device in a graphics application, so that the objects created can not only be seen but also touched. There are two parts to this project. One is to write a program to transform an Open Inventor file into a Phantom scene graph, so that any object model represented by an Open Inventor file can be integrated with the Phantom device to give a real feeling of three-dimensional existence. The second part of the project is to use the haptic device to feel computer-generated textures. One possible approach is to represent the texture as a mesh surface in the Phantom scene graph. One possible application is a haptic version of the fractally generated world applet.
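The "feel a texture" part can be sketched with the standard penetration-depth force model: treat the texture as a height field and push the probe tip back with a spring force when it dips below the surface. A sketch (the stiffness value and the procedural bump function are made-up placeholders, not Phantom API calls):

```python
import math

# Haptic texture sketch: spring force proportional to how far the
# probe tip has penetrated below a procedural height field.

STIFFNESS = 200.0  # N/m, hypothetical device stiffness

def surface_height(x, z):
    # A simple sinusoidal bump pattern standing in for a texture mesh.
    return 0.01 * math.sin(40 * x) * math.sin(40 * z)

def feedback_force(probe):
    """Upward force (N) on the haptic probe; zero when above the surface."""
    x, y, z = probe
    penetration = surface_height(x, z) - y
    return STIFFNESS * penetration if penetration > 0 else 0.0

above  = feedback_force((0.0,  0.05, 0.0))   # tip above the surface
inside = feedback_force((0.0, -0.02, 0.0))   # tip pressed into a bump
```

Running this at the device's servo rate (around 1 kHz for a Phantom) as the tip sweeps across the bumps is what produces the sensation of texture.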

    Zhang, Xiaoge:

    I will work on the Improv project. My goal is to develop actors who have some "intelligence" and can cooperate to some degree. This includes:
    1. Create characters.
    2. Give the actors the ability to find a way to solve a problem.
    3. Make the actors interact with each other, so they can cooperate to solve the problem. They can exchange information and adjust to changes in the environment.
    4. Let the user offer some help or act as one of the actors.
    To do this, I first need to read the documentation and get familiar with Improv. Besides programming, I have to read AI articles in order to develop cooperative actors.
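One simple way to realize the information exchange in step 3 is a shared blackboard: each actor claims an unclaimed subtask and announces the claim, and the task is done when every subtask has an owner. A toy sketch (actor and task names are purely illustrative, and this says nothing about how Improv itself passes messages):

```python
# Blackboard-style cooperation: actors claim open subtasks in turn
# until the whole problem is covered.

def cooperate(actors, subtasks):
    blackboard = {}                           # subtask -> claiming actor
    while len(blackboard) < len(subtasks):
        for actor in actors:
            open_tasks = [t for t in subtasks if t not in blackboard]
            if open_tasks:
                blackboard[open_tasks[0]] = actor   # claim and announce
    return blackboard

plan = cooperate(["ActorA", "ActorB"],
                 ["find key", "open door", "fetch fruit"])
```

Reacting to environment changes would amount to erasing a blackboard entry when its subtask becomes invalid, letting the actors re-claim it on the next round.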

    Jian Zheng:

    A.I. to evolve Improv personality

    Goals: Make the personalities of characters able to evolve with events. For example, some characters are more aggressive than others; when other characters approach too close to these aggressive characters, it is very likely that they will attack. At the same time, the whole character society is evolving: if almost all the characters in the scene are polite, that is, not inclined to attack others, then the aggressive ones gradually become more and more polite. We could also add some mutation to the personalities; that is, characters sometimes change suddenly from one type to the other.
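The drift-toward-the-society-norm rule plus mutation can be written in a few lines. A sketch (the drift and mutation rates are illustrative placeholders for values an author would tune):

```python
import random

# Evolution rule sketch: each character's aggressiveness drifts toward
# the society average, with an occasional sudden "mutation" flip.

def evolve(aggressiveness, drift=0.1, mutation_rate=0.0, rng=None):
    rng = rng or random.Random(0)
    avg = sum(aggressiveness) / len(aggressiveness)
    out = []
    for a in aggressiveness:
        a += drift * (avg - a)            # drift toward the social norm
        if rng.random() < mutation_rate:  # rare sudden personality change
            a = 1.0 - a
        out.append(a)
    return out

society = [0.9, 0.1, 0.1, 0.1]            # one aggressive character
for _ in range(50):                       # a mostly polite society...
    society = evolve(society)             # ...tames the aggressive one
```

Because the drift preserves the group average, a lone aggressive character in a polite society converges toward politeness, exactly the behavior described above; a nonzero mutation rate would reintroduce occasional sudden type changes.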

    Note: So far, I do not see the interoperation between characters or how they communicate. I hope that after looking through the current documents for Improv, we can find an interface in the system for adding new functionality and interactivity to characters.