Magritte VR

A Virtual Reality experience of René Magritte’s La Vengeance

May 2022

Team:

  • Ivery - Texturing, modelling, lighting interior set, programming

  • Sarah - Texturing, modelling, lighting exterior set

  • Katrina - Research and essay

  • Abhinav - Programming

  • Aidan - Sound Design

Software used:

  • Unity

  • Blender

  • Oculus Desktop

  • Photoshop

Outline:

This is a three-week in-class group project on the topic of “Simulating Reality” at Brown University. I pitched the idea and took the lead in modelling, texturing, setting up the project, programming, and integrating the VR controllers with our game mechanics.

My original pitch:

A 3D rendition of Magritte’s painting La Vengeance, in VR. There is nothing more challenging than recreating the works of Surrealist painters in 3D, as surrealism is meant to depict illogical, unrealistic scenes.

Refined pitch (after forming a group of five):

Magritte questions the status of representation in his paintings. This is a VR experience that lets users experience both of the possible realities his painting could be depicting.

Experience:

The viewer can walk around the room and enter the painting within the painting, a second ‘reality’. The user can switch between the flat landscape painting with its cloud wallpaper and the real mountains and real clouds.

Modelling:

Clouds (assets created by Sarah Jihyun Woo)

Mountains (assets created by Sarah Jihyun Woo)

Room + Painting + Interior (assets created by me)

Texturing + Shading:

I created these paint textures in Photoshop and Blender to mimic the brushwork of the original painting. Some of the textures were done using image projection (such as the metal ball).

One of the biggest challenges was creating the shader for the 3D scene inside the painting, so that only the parts contained inside the painting frame are shown and the rest is hidden.

We found existing resources online on the stencil buffer, adapted them, and wrote our own shaders that preserve the texture of the objects while also allowing them to “disappear” outside certain boundaries. We worked out variations of the shader for the easel, clouds, mountains, and painting frame, each using a slightly different combination of code; a minimal sketch of the general approach is shown below.
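The general idea behind the stencil-buffer approach: one shader writes a value into the stencil buffer wherever the painting’s “window” is visible, and the shaders on the objects inside the painting only draw fragments where that value is present. Below is a minimal sketch for Unity’s built-in render pipeline; the shader names and the Ref value of 1 are illustrative, not our exact code.

    // Sketch 1: drawn on an invisible "window" quad over the painting.
    // It writes 1 into the stencil buffer but outputs no colour.
    Shader "Hypothetical/PaintingStencilMask"
    {
        SubShader
        {
            Tags { "Queue" = "Geometry-1" }   // render before the masked objects
            ColorMask 0                       // no colour output, stencil only
            ZWrite Off                        // do not occlude the scene behind the window
            Stencil
            {
                Ref 1
                Comp Always
                Pass Replace
            }
            Pass { }
        }
    }

    // Sketch 2: placed on objects inside the painting (clouds, mountains, ...).
    // Fragments are drawn only where the stencil buffer already contains 1,
    // so the geometry "disappears" outside the painting's frame.
    Shader "Hypothetical/InsidePainting"
    {
        Properties { _MainTex ("Texture", 2D) = "white" {} }
        SubShader
        {
            Pass
            {
                Stencil
                {
                    Ref 1
                    Comp Equal
                }

                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                float4 _MainTex_ST;

                struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

                v2f vert (appdata_base v)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(v.vertex);
                    o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    return tex2D(_MainTex, i.uv);   // keep the painted texture intact
                }
                ENDCG
            }
        }
    }

Per-object variations of this pattern, such as different Ref values, comparison functions, or render queues, are one way to get the slightly different behaviour the easel, clouds, mountains, and frame each needed.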

Shaders we created with scripts

Equipment:

We initially had access only to an HTC Vive, but after setting our project up, we realised that there were many limitations.

Space Limitations

  • The ceiling in the model is low and the painting appears small because the physical room is small.

  • Limited to this space because we are using the Vive headset plugged into a PC at Fulvio’s lab.

  • We are able to touch two physical walls but not the other two.

  • We could possibly set-dress the space so people can feel more with their hands while exploring - prop walls, an easel and canvas? Does that take away or add something?

Programming Limitations

  • After setting up, we realised that it is very difficult to get the user’s eye-line to line up with the exact point needed for a seamless transition between the 3D clouds (real) and the projected clouds (painting).

  • We decided to use a button press that lets users switch between the two forms (sketched after this list).

  • Alternatively, we could teleport people back to the exact point and remove the walking component, but we think what makes this project more immersive is the ability to walk around the room.
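For reference, here is a minimal sketch of that button-driven switch, assuming the Oculus Integration package’s OVRInput API (we later moved to a Quest 2, as noted below); the object names, the choice of grip button, and the audio handling are illustrative rather than our actual script.

    using UnityEngine;

    // Toggles between the painted 2D world and the real 3D world on a
    // controller button press, and switches the music with it.
    // Field names are hypothetical; assign the objects in the Inspector.
    public class RealitySwitcher : MonoBehaviour
    {
        [SerializeField] private GameObject paintedWorld;   // flat landscape painting + cloud wallpaper
        [SerializeField] private GameObject realWorld;      // 3D mountains and clouds
        [SerializeField] private AudioSource realWorldMusic;

        private bool showingRealWorld = false;

        void Start()
        {
            Apply();
        }

        void Update()
        {
            // The grip ("side") button on either controller flips the two realities.
            if (OVRInput.GetDown(OVRInput.Button.PrimaryHandTrigger) ||
                OVRInput.GetDown(OVRInput.Button.SecondaryHandTrigger))
            {
                showingRealWorld = !showingRealWorld;
                Apply();
            }
        }

        private void Apply()
        {
            paintedWorld.SetActive(!showingRealWorld);
            realWorld.SetActive(showingRealWorld);

            // Music plays only while the 3D world is visible.
            if (showingRealWorld) realWorldMusic.Play();
            else realWorldMusic.Stop();
        }
    }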

Shadows

  • It looked strange that when we looked down there was no body or shadow.

We eventually got access to an Oculus Quest 2 and used it for our project.

Outcome:

Below are side-by-side comparison images of the projected 2D world and the 3D world. Pressing the buttons on the sides of both controllers toggles back and forth between them, and the switch is accompanied by music that turns on and off with it.

Left: Fake and projected 2D world

Right: Real 3D world