During a one-week workshop given by Cyril Diagne, second-year students explored the integration of machine learning tools into their creative process.
By limiting the coding step in favour of prompting, they experimented with AI models such as GPT-3, CLIP and DALL-E to create texts, images and videos.
Workshop (2022) with Cyril Diagne
Comparing the way our brain seems to generate our dreams with the way some AI models work, Elina Crespi used a diffusion model to represent her dreams.
By Elina Crespi
Tickie created a dialogue with the AI, asking it what a utopian city would be like. She collected the conversations and images and archived them on a website.
By working hand in hand with several AIs (for the visuals, the music, the voice-over and the subtitles), Adryan Barilliet created a new mythology narrating the creation of a world after the fusion of the divine and the machine.
Niki created an interactive quiz in which the user must find the French expression represented by the images.
By Niki Zaal
After Midnight shows what the minutes, hours, days, months and years after the end of the world might look like. In 2021, the “Doomsday Clock” of the Bulletin of the Atomic Scientists showed that we were 100 seconds from midnight, a possible end of the world, on a 24-hour scale. Arthur Lucchesi imagined what the end of the world might look like after that time.
Jeanne used machine learning to access the universes of her favorite books, which are poorly documented visually.
With text inputs, she explored different environments like the forest, the ocean and the sky to create animated lock screens for smartphones.
By Jeanne Weber