Unreal Engine. Jamie Mossahebi.
This lecture was postponed a couple of times but eventually took place one Thursday evening. It was a guest lecture by Jamie Mossahebi, who works on Unreal Engine at Epic Games.
Jamie talked about his work experience but also gave us some tips related to job hunting and working in the industry.
360-degree video was the first topic of his lecture. This form of capturing images is still a relatively new format, despite having been around for a while. 360-degree video can be recorded using a rig of multiple cameras placed to cover all 360 degrees, or a dedicated camera with multiple embedded lenses recording overlapping images simultaneously.
As examples of 360-degree videos available to watch on YouTube, Jamie mentioned virtual captures in which the viewer can look around the scene freely.
Jamie’s own experience of recording 360-degree video was quite challenging: with no high-quality equipment at first, he and his colleagues experimented a lot to achieve the desired results. They started with a single camera rotated around to cover the full 360 degrees, but that approach only produced still panoramas, not video, so action cameras were needed. They moved on to a 14-camera rig, which was better but, as Jamie said, still had its limitations.
The 360-degree video production pipeline Jamie outlined runs roughly as follows: capture overlapping images from multiple cameras around a shared pivot point, stitch them algorithmically into a single flat (equirectangular) image, and finally project that flat output onto a sphere for playback (polar projection).
Stereoscopic rendering produces separate images for the left eye and the right eye; both Unity and Unreal support it.
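The relationship between the flat stitched image and the playback sphere can be sketched in a few lines. This is a minimal illustration of the equirectangular mapping described above; the function name and the 4096x2048 frame size are my own assumptions for the example, not anything Jamie specified.

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a 3D view direction onto equirectangular (flat 360) image coordinates.

    The stitched flat image covers 360 degrees horizontally (longitude)
    and 180 degrees vertically (latitude); projecting that image back
    onto a sphere for playback inverts this mapping.
    """
    lon = math.atan2(x, z)  # -pi..pi, 0 = straight ahead
    lat = math.asin(y / math.sqrt(x * x + y * y + z * z))  # -pi/2..pi/2
    u = (lon / (2 * math.pi) + 0.5) * width   # horizontal pixel position
    v = (0.5 - lat / math.pi) * height        # vertical pixel position
    return u, v

# Looking straight ahead lands in the centre of a 4096x2048 frame.
print(direction_to_equirect(0.0, 0.0, 1.0, 4096, 2048))  # (2048.0, 1024.0)
```

For a stereoscopic render, the same mapping is simply applied twice, once per eye, from two slightly offset viewpoints.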
One consideration was that the cameras had to be placed as close together as possible, to minimise parallax between the overlapping views; this was not always achievable, though it has become easier as the technology has developed.
Jamie mentioned a project he produced with a friend in New York City: virtual footage of NYC viewed in a VR headset, where the participant, sitting in a car in London, UK, ate a subway sandwich in real life while having the experience of taking a taxi ride through NYC.
Jamie experimented a great deal with recording video. He and his friends attached cameras to boats, cars and drones, and worked on sports events in 2016 with high-end equipment such as the Nokia OZO.
The Insta360 Pro and GoPro are 360-degree cameras currently available on the market, suitable for casual users but also for professional productions. That does not mean this type of capture is no longer high end: there are still serious 360-degree productions involving a lot of professional equipment and processing. Premiere and YouTube both offer tools for post-processing 360-degree videos.
Next, Jamie talked to us about Unreal Engine. He started by pointing out the difference between rendering frames in games and in films: in games the rendering budget per frame is strictly limited, while in films one can spend all day rendering a single frame to achieve the best possible quality. That gap has been closing, and games now achieve high-fidelity visuals too.
Epic Games is the company that builds games in Unreal Engine (Fortnite, currently the biggest game in existence), but it also supports creators by offering Unreal Engine as a free download, with the same tools Epic uses in its own production environment. I know a little about Unreal thanks to our two introductory sessions with Herman Ho this term, and I have already set myself the task of learning to work in this engine over the summer holidays.
Unreal has a motto of “one asset, many uses”: still images, linear output, virtual production, interactive experiences, immersive experiences, and augmented and mixed reality experiences.
This reflects a comprehensive concept of virtual production: more flexibility in building sets, a holistic approach, and multiple uses for each asset.
A virtual build in Unreal means virtual production: Jamie presented an example using LED screen walls and cameras, where only the necessary part of the scene is rendered at the highest possible quality.
Virtual production is actually an umbrella term that also covers animation (Jamie showed an example of a Weta Digital production).
The development of virtual production not only points towards the complete removal of green screens; it is also an important tool for environmentally conscious productions, as it dramatically reduces the carbon footprint (there is no need to travel around the world for specific shots). In the long run, it also saves money.
Recently, big global companies such as Netflix and Google have been working on virtual content and next-generation gaming consoles that use elements of virtual production.
MetaHuman Creator is a web-based, pixel-streaming tool for creating human assets/3D models that can be downloaded into Maya, processed further, and then used in the game engine. Jamie shared a short film released for Halloween that used MetaHuman models, Unreal Engine and MoCap. Last term we had a chance to try out MetaHuman, with one session dedicated to working with the tool. Unreal offers a wide range of tools for making virtual productions easier and more efficient, including a feature that allows face tracking using a mobile phone.
At the end of the lecture, our guest talked to us about working in the industry. He said that in real life you are never a specialist in every aspect of a piece of software, an engine or a field, so not being an expert should not stop us from applying for a job. It is great to have knowledge and a focus, but no one expects us to know everything.
One tip Jamie gave us was to familiarise ourselves with Unreal Engine, as there is a shortage of specialists in it, and at some point in our careers we will be asked to use it to develop games or experiences.
Jamie also offered his help should we want advice on job searching in the gaming industry or on our application documents.
It was an informative meeting that inspired me to study and become familiar with technology and tools I don’t yet know much about. It also reassured me that I made the right choice in studying VR and choosing it as my future career, as it is an ever-developing field with enormous potential.