
Adaptive Robotics & Technology Lab

Texas A&M University College of Engineering

ART Studio

ART Studio features the creative intersection of art and robotics. Here we introduce some of our recent ‘ART + Robotics’ projects and share upcoming initiatives and events. If you are interested in joining ART Studio or participating in exhibitions, please contact the PI, Kiju Lee.

Sound of the Bloom @ The Kimbell, Bryan, TX, April 2026

Stay tuned for more information about the upcoming exhibition in April 2026.

The Bloom in Mixed Reality @ Arts in Robotics, ICRA 2025

https://art.engr.tamu.edu/wp-content/uploads/sites/170/2026/01/theBloominMR.mp4

Created by Neha Vemula (MEEN Grad), Naomi Drori (MEEN Undergrad), Paul Moubarak (MEEN Undergrad), Yuan Wei (MEEN Ph.D. ’25), Hayyam Iqbal (MEEN Grad), and Dr. Kiju Lee.

The Bloom in Mixed Reality was showcased at the ICRA 2025 Arts in Robotics Exhibition in Atlanta, GA, in May 2025. We introduced a new collection of “blooming” flowers integrated with virtual butterflies, creating an immersive, interactive experience for audiences. Using Microsoft HoloLens 2, the system detects an audience member’s hand, allowing a virtual butterfly to fly, land, and be gently guided toward the flowers. An embedded sound system plays soft ambient music, further enhancing the atmosphere. Together, these elements create a uniquely engaging and memorable mixed-reality experience.

The Bloom @ SoundBox, San Francisco Symphony, 2024

https://art.engr.tamu.edu/wp-content/uploads/sites/170/2026/01/theBloom-small.mp4

Created by David Ho (MEEN BS ’24), Hayyam Iqbal (MEEN Grad), Naomi Drori (MEEN Undergrad), Paul Moubarak (MEEN Undergrad), and Dr. Kiju Lee.

The Bloom, an interactive robotic flower art installation, was showcased at the San Francisco Symphony’s SoundBox event, curated by Carol Reiley, on April 5–6, 2024.

The installation is composed of modular hexagonal panels that can be assembled in varying numbers and configurations. The presented piece featured three panel types: one incorporating dual servo motors to produce origami “blooming” motions, another enabling a single rotating motion, and a third dedicated to dynamic LED lighting. The origami forms were carefully selected and arranged to illustrate a visual progression—from geometric and static structures to more organic, curved, and flower-like designs. All motions and lighting effects are fully programmable through integrated sensors and onboard computing units, enabling responsive and expressive interaction.

Figure. CAD models of the three panel types. The universal light panels house LED strips. Magnets hold layers of each panel.
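To illustrate the kind of sensor-driven, programmable motion described above, here is a minimal sketch of how a dual-servo “blooming” panel might map a proximity reading to petal angles. The function name, the normalized sensor reading, and the lagging inner-petal mapping are all hypothetical assumptions for illustration, not the installation’s actual control code.

```python
# Hypothetical sketch: mapping a normalized proximity reading
# (0.0 = far, 1.0 = near) to the two servo angles of a
# dual-servo "blooming" panel. Names and mapping are assumptions.

def bloom_angles(proximity: float, max_angle: float = 90.0) -> tuple[float, float]:
    """Return (outer, inner) servo angles in degrees.

    The outer petals open first; the inner petals lag behind,
    giving a layered, organic blooming motion.
    """
    p = min(max(proximity, 0.0), 1.0)                 # clamp sensor noise
    outer = p * max_angle                             # outer petals open immediately
    inner = max(0.0, (p - 0.25) / 0.75) * max_angle   # inner petals start at 25%
    return outer, inner

# Example readings as an audience member approaches:
for reading in (0.0, 0.5, 1.0):
    print(bloom_angles(reading))
```

On real hardware the returned angles would be converted to PWM duty cycles for the panel’s servos; this sketch only shows the shape of the mapping, with the staggered opening chosen to mimic the layered origami motion the text describes.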


© 2016–2026 Adaptive Robotics & Technology Lab
