LIGHTING INTERACTIVE DESIGN: FINAL DESIGN
———
Group 8
Advanced Modeling and Animation / Bachelor of Interactive Spatial Design
———
Because this semester coincided with the Chinese New Year, we chose the theme of "Nian," the New Year monster. Before settling on it, we had considered several other ideas, such as "Emotion Template" and "Gold Miner," but rejected them after our teacher told us they would be too difficult to implement. In the end we decided on a simpler presentation, which is how "Nian" came about.
Background:
Long, long ago, in a quiet mountain village in China, people led peaceful lives. But every New Year's Eve, a fierce monster named "Nian" emerged from the deep mountains. It had long antennae on its head and a huge mouth; wherever it went, villages fell into ruin and people fled in terror. Later, people discovered that the Nian beast was afraid of the color red, of flames, and of loud noises. Thus the customs of pasting Spring Festival couplets, setting off firecrackers, and lighting lanterns were born, and the Nian beast never dared approach the village again. As time goes by, however, the hustle and bustle of modern society gradually drowns out the ancient legend. We therefore want to use technology to let people experience this legend for themselves and feel the thrill of driving the Nian beast away.
The Three Wonders of Particle Systems: Fireworks, Flames, and Wind:
In real-time interaction and visual effects, particle systems are among the most fundamental and versatile tools available. By manipulating the movement and state changes of countless individual particles, they can simulate a wide range of effects, from natural phenomena to fantastical magic. This article uses three typical examples—fireworks, flames, and wind—to illustrate their particle movement logic and gesture-triggered interaction methods.
01 Fireworks: Ascent and Radial Explosion
The fireworks effect consists of two distinct phases.
Phase 1: Ascent
The fireworks gain initial velocity from the ground and ascend while decelerating under the influence of gravity. This phase relies solely on basic dynamics formulas, with particles moving along a parabolic trajectory.
Phase 2: Explosion
When velocity drops to zero, the fireworks reach their apex, instantly generating a large number of particles. These particles disperse in all directions via a radial effect, forming a spherical or circular diffusion pattern. After the explosion, each particle continues to fall under the influence of gravity and gradually fades out as its lifespan expires.
This effect is triggered by the “Open” gesture (opening the palm).
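The two-phase logic above can be sketched in a few lines of Python. This is a minimal illustration, not the project's actual code; the gravity constant, launch speed, and particle count are assumed values.

```python
import math
import random

GRAVITY = 0.2  # assumed per-frame gravity, in pixels/frame^2

def launch():
    """Phase 1: a single shell with an initial upward velocity.

    Image coordinates are used, so a negative vy means moving up.
    """
    return {"x": 0.0, "y": 0.0, "vx": 0.0, "vy": -10.0, "exploded": False}

def explode(shell, count=50):
    """Phase 2: spawn `count` particles radially from the apex."""
    sparks = []
    for i in range(count):
        angle = 2 * math.pi * i / count
        speed = random.uniform(2.0, 4.0)
        sparks.append({
            "x": shell["x"], "y": shell["y"],
            "vx": speed * math.cos(angle),
            "vy": speed * math.sin(angle),
            "life": 60,  # frames until the spark fades out
        })
    return sparks

def step(shell):
    """Advance the shell one frame; return sparks once it reaches apex."""
    shell["vy"] += GRAVITY          # gravity decelerates the ascent
    shell["y"] += shell["vy"]
    shell["x"] += shell["vx"]
    if shell["vy"] >= 0 and not shell["exploded"]:
        shell["exploded"] = True    # apex: vertical velocity reaches zero
        return explode(shell)
    return []
```

After the explosion, each spark would keep receiving the same gravity update and fade as its `life` counter runs out, exactly as described above.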
02 Flame: Upward Motion and Random Perturbation
Flames are also based on a particle system, but their motion mechanism differs significantly from that of fireworks.
The primary driving force for flame particles is the rise of hot air, so the particles move upward in a vertical direction as a whole. To simulate the flickering sensation of real flames, random perturbations are applied to the velocity vector of each particle, resulting in natural, irregular movement trajectories.
Additionally, as particles reach the end of their lifecycle, their opacity gradually decreases, creating a fade-out effect to avoid the abruptness of sudden disappearance.
The three core elements of the flame effect can be summarized as:
· Upward Movement
· Random Perturbation
· Fade-Out Effect
This effect is triggered by the “Waving” gesture.
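The three elements can be combined in a short update loop. This is a hedged sketch rather than the project's implementation; the base velocity, jitter ranges, and lifetime are assumed values.

```python
import random

def spawn_flame_particle():
    """New particle at the flame base; negative vy = upward in image coords."""
    return {"x": 0.0, "y": 0.0, "vy": -2.0, "life": 30, "max_life": 30}

def update_flame(particles):
    """One frame: rise, jitter, fade, and cull dead particles."""
    alive = []
    for p in particles:
        p["vy"] += random.uniform(-0.3, 0.3)   # random perturbation (flicker)
        p["x"] += random.uniform(-0.5, 0.5)    # slight horizontal jitter
        p["y"] += p["vy"]                       # upward drift of hot air
        p["life"] -= 1
        p["alpha"] = max(p["life"] / p["max_life"], 0.0)  # fade-out
        if p["life"] > 0:
            alive.append(p)
    return alive
```

Tying `alpha` to the remaining lifetime is what avoids the abrupt disappearance mentioned above: a particle is fully transparent by the time it is removed.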
03 Wind: Directional Continuous Flow
The wind effect simulates airflow using a particle system. The system continuously generates fine, line-shaped particles, using motion blur to create the perception of “moving wind.”
The key lies in directional control: the wind’s direction is not random but is calculated in real-time based on the vector from the character to the target (Direction from Character to Target), ensuring the wind always points toward the target object and creating a clear spatial directionality.
In terms of operational mechanics, the system generates new particles every frame while removing those that have reached the end of their lifecycle, thereby maintaining a stable particle count and creating a continuous, flowing wind effect.
This effect is triggered by a “fanning” gesture, establishing a direct mapping between user actions and visual feedback.
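The per-frame spawn-and-cull loop and the character-to-target direction vector could be sketched as follows. The spawn rate, speed, and lifetime are illustrative values, not the project's tuned parameters.

```python
import math

def wind_direction(char_pos, target_pos):
    """Unit vector from the character to the target."""
    dx = target_pos[0] - char_pos[0]
    dy = target_pos[1] - char_pos[1]
    length = math.hypot(dx, dy) or 1.0  # avoid division by zero
    return dx / length, dy / length

def update_wind(particles, char_pos, target_pos, spawn_per_frame=5, speed=8.0):
    """One frame: emit new streak particles toward the target, cull expired."""
    dx, dy = wind_direction(char_pos, target_pos)
    # Emit new particles every frame so the stream never breaks.
    for _ in range(spawn_per_frame):
        particles.append({"x": char_pos[0], "y": char_pos[1],
                          "vx": dx * speed, "vy": dy * speed, "life": 20})
    # Advance and keep only particles whose lifetime has not expired,
    # which holds the total particle count roughly stable.
    alive = []
    for p in particles:
        p["x"] += p["vx"]
        p["y"] += p["vy"]
        p["life"] -= 1
        if p["life"] > 0:
            alive.append(p)
    return alive
```

Because the direction is recomputed from the two positions every frame, the stream re-aims automatically if either the character or the target moves.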
Summary
From the initial velocity + gravity + radial explosion of fireworks, to the upward motion + random perturbations + fade-out of flames, and finally to the wind’s direction vector + per-frame updates + continuous flow, particle systems can construct rich dynamic visual effects by combining several fundamental physics and graphics concepts. Combined with gesture recognition, these effects are no longer just animations on the screen but become a more natural language of interaction.
Implementation of Gesture Recognition Logic:
In this group project, I was responsible for the hand gesture recognition module. The core task was to accurately identify three gestures from the hand keypoints returned by MediaPipe: open hand (fireworks), horizontal waving (flame), and vertical fanning (wind). The following describes how the gesture recognition logic was designed.
1. Overall Approach
MediaPipe returns coordinates of 21 keypoints per hand per frame. Based on these coordinates, I implemented two types of judgments:
· Static gestures: open hand, determined by the positional relationships between keypoints.
· Dynamic gestures: horizontal waving and vertical fanning, determined by tracking hand movement trajectories.
To maintain stable state output, a debouncing mechanism was also added.
2. Open Hand Detection
Detecting an open hand requires checking whether all five fingers are extended. MediaPipe provides the positions of fingertips and the corresponding base joints (MCP):
· For the index, middle, ring, and little fingers, compare the y-coordinates of the fingertip and base. In the image coordinate system, a smaller y value indicates a higher position. When the finger is extended, the fingertip is usually higher (smaller y) than the base.
· For the thumb, due to its different orientation, the comparison is done on the x-coordinate: for the right hand, the thumb tip x is greater than the base x when extended; for the left hand, it is the opposite.
When all five fingers meet the extension condition, the hand is considered open.
To improve accuracy, I did not rely on a single frame. Instead, I required the open-hand condition to be satisfied for several consecutive frames before outputting a valid state.
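The single-frame check can be sketched with MediaPipe Hands' standard landmark indexing (tips at 8/12/16/20, MCP joints at 5/9/13/17, thumb tip 4 and thumb MCP 2); the multi-frame confirmation is handled by the debouncing described in section 4.

```python
# Landmark indices from MediaPipe Hands (21 keypoints per hand).
FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, little fingertips
FINGER_MCPS = [5, 9, 13, 17]    # corresponding base (MCP) joints
THUMB_TIP, THUMB_MCP = 4, 2

def is_hand_open(landmarks, handedness="Right"):
    """landmarks: list of 21 (x, y) pairs in image coordinates.

    In image coordinates y grows downward, so an extended finger has
    its tip at a smaller y than its base joint.
    """
    for tip, mcp in zip(FINGER_TIPS, FINGER_MCPS):
        if landmarks[tip][1] >= landmarks[mcp][1]:
            return False  # this finger is not extended
    # Thumb points sideways, so compare x instead, mirrored per hand.
    if handedness == "Right":
        return landmarks[THUMB_TIP][0] > landmarks[THUMB_MCP][0]
    return landmarks[THUMB_TIP][0] < landmarks[THUMB_MCP][0]
```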
3. Horizontal Waving and Vertical Fanning Detection
These two gestures are dynamic and require analyzing hand displacement. I maintained a trajectory buffer for each hand, storing palm center coordinates from the most recent frames.
Horizontal Waving (WAVING)
· Check whether both hands are present.
· Calculate the horizontal displacement of each hand between the current frame and the historical frame(s) in the buffer.
· If both hands have horizontal displacement exceeding a preset threshold (e.g., 50 pixels), it is recognized as horizontal waving.
Vertical Fanning (FANNING)
· Similarly, check for the presence of both hands.
· Calculate the vertical displacement of each hand.
· If both hands have vertical displacement exceeding the threshold, it is recognized as vertical fanning.
The displacement thresholds were determined through repeated testing to ensure that normal slight movements do not trigger the gesture, while deliberate large motions are reliably detected.
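The trajectory buffer and the two displacement checks might look like this; the buffer size is an assumed value, while the 50-pixel threshold comes from the text above.

```python
from collections import deque

BUFFER_SIZE = 10       # assumed: palm centers from the last 10 frames
THRESHOLD_PX = 50      # displacement threshold from repeated testing

class TwoHandTracker:
    def __init__(self):
        self.trails = {"Left": deque(maxlen=BUFFER_SIZE),
                       "Right": deque(maxlen=BUFFER_SIZE)}

    def push(self, label, palm_xy):
        """Record this frame's palm center for one hand."""
        self.trails[label].append(palm_xy)

    def _displacement(self, label, axis):
        """Absolute movement along one axis across the buffered frames."""
        trail = self.trails[label]
        if len(trail) < 2:
            return 0.0
        return abs(trail[-1][axis] - trail[0][axis])

    def detect(self):
        """Return 'WAVING', 'FANNING', or None; both hands are required."""
        if not all(len(t) >= 2 for t in self.trails.values()):
            return None
        if all(self._displacement(h, 0) > THRESHOLD_PX for h in self.trails):
            return "WAVING"    # large horizontal motion on both hands
        if all(self._displacement(h, 1) > THRESHOLD_PX for h in self.trails):
            return "FANNING"   # large vertical motion on both hands
        return None
```

The `deque(maxlen=…)` automatically discards the oldest palm position, so the buffer always covers only the most recent frames.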
4. State Debouncing
If the state were updated directly per frame, minor hand jitter would cause frequent state fluctuations. I introduced a debouncing mechanism:
· When a gesture is detected, it does not immediately change the state. Instead, it enters a “pending” state and starts a timer.
· Only if the same gesture persists for a set duration (e.g., 10 ms) does the current state update to that gesture.
· If the gesture disappears or changes during the pending period, the timer resets.
This approach ensures fast response while filtering out brief fluctuations.
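The pending-state timer can be sketched as a small state machine; the default hold time mirrors the example duration above, and the injectable clock is only there to make the sketch testable.

```python
import time

class GestureDebouncer:
    """Commit a gesture only after it has persisted for `hold_s` seconds."""

    def __init__(self, hold_s=0.01, clock=time.monotonic):
        self.hold_s = hold_s
        self.clock = clock          # injectable for deterministic testing
        self.pending = None         # candidate gesture awaiting confirmation
        self.pending_since = None
        self.state = None           # last committed gesture

    def update(self, gesture):
        """Feed one frame's raw detection; return the debounced state."""
        now = self.clock()
        if gesture != self.pending:
            self.pending = gesture        # new candidate: restart the timer
            self.pending_since = now
        elif now - self.pending_since >= self.hold_s:
            self.state = gesture          # held long enough: commit
        return self.state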
5. Handling Special Cases
· Single-hand scenario: Horizontal waving and vertical fanning require both hands. If only one hand is present, only open-hand detection is considered.
· Hand loss: MediaPipe may occasionally lose one hand for a short period. I used the trajectory buffer to smooth this: if the previous frame had two hands but the current frame only has one, I temporarily reuse the previous frame’s data for the missing hand to prevent the action from being interrupted prematurely.
6. Parameter Tuning and Results
The recognition logic depends on several key parameters: displacement thresholds, trajectory buffer size, and debounce time. All parameters were centralized in a configuration object for easy adjustment. After extensive testing, the final parameters yielded stable recognition rates under normal lighting conditions, with an acceptable false-trigger rate.
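Such a configuration object might simply be a flat dictionary. The thresholds below are the values quoted earlier in this write-up; the buffer size and frame count are illustrative assumptions.

```python
# Hypothetical centralized configuration for the recognition module.
CONFIG = {
    "wave_threshold_px": 50,    # min horizontal displacement for WAVING
    "fan_threshold_px": 50,     # min vertical displacement for FANNING
    "trail_buffer_frames": 10,  # assumed palm-center history length
    "debounce_s": 0.01,         # hold time before a gesture commits
    "open_hand_frames": 3,      # assumed consecutive frames for OPEN
}
```

Keeping every tunable in one place means a threshold change during testing touches a single line rather than several functions.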
Summary
The core of the gesture recognition logic lies in distinguishing static from dynamic gestures: using keypoint coordinates for finger extension analysis, using trajectory buffers for hand motion analysis, and applying debouncing to ensure state stability. This design decouples the module from the rest of the project and leaves room for future expansion to support more gestures.
Physical Background Board
First, we brainstormed, exploring different creative directions and project possibilities. We shared our perspectives and evaluated which ideas were most feasible and meaningful. Next, we collected a wealth of relevant material to support the refinement of our concept. Based on this material and the initial discussions, we organized the content according to the specific location and requirements of each task and created an initial draft. This draft served as a basic framework, enabling us to conceive the overall structure and refine it in subsequent stages.
After completing the design draft, our group discussed the production approach and unanimously decided to use cardstock as the main material, as we felt it closely matched the visual style and texture of the Nian Beast theme we had designed. Cardstock is not only easy to shape and assemble, but it also provides a bold and structured appearance that enhances the overall aesthetic of our work.
To prepare for the production, we purchased various materials online, including different types of cardstock and cardboard. These materials allowed us to refine and decorate the details of our model more effectively.
By combining layering and cutting techniques, we were able to further enhance the visual impact and bring our design concept to life in a more vivid and cohesive way, adding and modifying details throughout the production process.