Posthuman Mobility research team develops design methodologies for Cyborg Natures!
The Posthuman Mobility research team was formed as part of the research collaboration between Hyundai Motor Group and Rhode Island School of Design, which examines relationships between natural and built environments to propose new directions for the future of mobility. Led by Anastasiia Raina (Assistant Professor, GD), an interdisciplinary group of RISD students created an immersive project titled Microbial Cosmologies. The group included Georgina Nolan (GD, MFA 2021), Danlei Huang (ID, MFA 2021), Yimei Hu (Jewelry + Metalsmithing, ID, BFA 2021), and Meredith Binnette (FVA, BFA 2020). Responding to the evolving pandemic, the team explored the future of mobility in a microbe-centric world, addressing the relationship between humans and microbes and our need to respond and adapt with agility to the changing world around us. The team developed models for collaboration with nature as an alternative to models merely inspired by nature. They focused on interspecies collaboration, cyborg nature, and designed nature, including future mobility hubs and human identification methods, drawing on speculative design, industrial design, augmented reality, virtual reality, and machine-learning technologies.
What does it mean to be inspired by nature when we live in artificial and cyborg landscapes and ecosystems? What new design methodologies can we develop in this techno-ecology? The Posthuman Mobility Lab critically explored the process of nature simulation in art, design, and technology during a six-week RISD-Hyundai Summer Collaborative Research project led by Anastasiia Raina. Five RISD graduate and undergraduate students, Qihang Li (Fine Art, BFA 2020), Zack Davey (Fine Art, BFA 2020), Yimei Hu (Jewelry + Metalsmithing, ID, BFA 2021), Meredith Binnette (FVA, BFA 2020), and Danlei Huang (ID, MFA 2021), focused on one fascinating question: how is form generated when the boundary between the natural and the automated becomes obsolete?
The research team explored the evolutionary development processes and environmental adaptations that result in unique capabilities and morphologies in plants, insects, and animals. Our research employed the style-transfer and image-synthesis capabilities of Generative Adversarial Network (GAN) algorithms to analyze and combine visual data into custom-generated models. We worked closely with AI researcher Lia Coleman, who introduced us to GAN algorithms and guided us through collecting a dataset, training custom GAN models, and manipulating those models in latent space. The StyleGAN training process, together with the generated images and videos, served as source material for our project From Chaos to Order, a simulated environment that models the growth and morphology of techno-ecological embryonic specimens.
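The latent-space manipulation mentioned above can be sketched in miniature. The following is an illustrative, stdlib-only sketch, not the team's actual StyleGAN pipeline: it samples random latent vectors of the kind a StyleGAN-family generator consumes and interpolates between them spherically, the standard technique behind smooth GAN morphing videos. The `generator(z)` step that would turn each vector into an image is only indicated in a comment.

```python
import math
import random

def random_latent(dim=512, seed=None):
    """Sample a latent vector z from a standard normal distribution,
    the usual input space for StyleGAN-family generators."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(dim)]

def slerp(z1, z2, t):
    """Spherical linear interpolation between two latent vectors.
    Sweeping t from 0 to 1 yields the smooth morphing seen in GAN videos."""
    dot = sum(a * b for a, b in zip(z1, z2))
    norm1 = math.sqrt(sum(a * a for a in z1))
    norm2 = math.sqrt(sum(b * b for b in z2))
    omega = math.acos(max(-1.0, min(1.0, dot / (norm1 * norm2))))
    so = math.sin(omega)
    if so < 1e-8:  # nearly parallel vectors: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(z1, z2)]
    return [(math.sin((1 - t) * omega) / so) * a +
            (math.sin(t * omega) / so) * b
            for a, b in zip(z1, z2)]

# A short "camera path" through latent space; in a real pipeline each
# frame's vector would be fed to a trained generator, e.g. generator(z).
z_start, z_end = random_latent(seed=1), random_latent(seed=2)
frames = [slerp(z_start, z_end, i / 9) for i in range(10)]
```

Spherical (rather than linear) interpolation is conventional here because Gaussian latent vectors concentrate near a hypersphere, so slerp keeps intermediate frames in the region the generator was trained on.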
In short, we translated sound into xyz coordinates and used them to grow abstract forms generated by StyleGAN2, a machine-learning model; for this reason, the process videos in this project matter more than any final outcome.
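One simple way to translate sound into xyz coordinates can be sketched as follows. This is a hypothetical mapping for illustration only, not the team's documented scheme: amplitude drives the distance from a central axis while time sweeps a helical path upward, so a recorded sound traces a three-dimensional growth curve.

```python
import math

def sound_to_xyz(samples, turns=4.0):
    """Map an audio signal to a 3D growth path: amplitude sets the
    radius, time drives a helical sweep along the z axis. A hypothetical
    mapping for illustration, not the project's actual translation."""
    points = []
    n = len(samples)
    for i, amp in enumerate(samples):
        t = i / n                          # normalized time, 0..1
        angle = 2 * math.pi * turns * t    # helical rotation
        r = abs(amp)                       # louder -> farther from the axis
        points.append((r * math.cos(angle),   # x
                       r * math.sin(angle),   # y
                       t))                    # z grows steadily with time
    return points

# A synthetic 440 Hz tone stands in for recorded sound.
tone = [math.sin(2 * math.pi * 440 * i / 44100) for i in range(1000)]
path = sound_to_xyz(tone)
```

Each resulting point could then seed or displace the geometry of a generated form, which is why the evolving path, rather than any single frame, carries the work.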