News

How Eidos-Montréal created Grid Sensors to improve observations for training agents

Posted in Technology

Within Eidos Labs, several projects use machine learning. The Automated Game Testing project tackles the problem of testing the functionality of expansive AAA games by modeling player behavior with agents trained through reinforcement learning (RL). In this blog post, we’ll describe how the team at Eidos Labs created the Grid Sensor within the Unity Machine Learning Agents Toolkit (ML-Agents) to better represent the game for machine learning, improving training times and ultimately leading to less expensive models.
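The core idea behind a grid observation is straightforward: divide the space around the agent into cells and encode what occupies each cell as channels, producing an image-like tensor that convolutional networks handle well. Below is a minimal sketch of that idea in Unity C# (the class, fields, and tags are hypothetical illustrations, not the toolkit’s actual GridSensor API):

```csharp
using UnityEngine;

// Hypothetical sketch (not the ML-Agents GridSensor API): divide the area
// around the agent into cells and record which tagged objects occupy each
// cell, yielding an image-like observation that a CNN can consume.
public class SimpleGridObservation : MonoBehaviour
{
    public int gridSize = 16;                                // cells per side
    public float cellSize = 1f;                              // world units per cell
    public string[] detectableTags = { "Enemy", "Pickup" };  // one channel per tag

    // Builds a gridSize x gridSize x channels one-hot encoding of the
    // agent's surroundings.
    public float[,,] Collect()
    {
        var obs = new float[gridSize, gridSize, detectableTags.Length];
        float half = gridSize * cellSize * 0.5f;

        for (int x = 0; x < gridSize; x++)
        {
            for (int z = 0; z < gridSize; z++)
            {
                // World-space center of this cell, laid out around the agent.
                Vector3 center = transform.position + new Vector3(
                    x * cellSize - half + cellSize * 0.5f,
                    0f,
                    z * cellSize - half + cellSize * 0.5f);

                // Mark the channel of any detectable object overlapping the cell.
                foreach (Collider hit in Physics.OverlapBox(center, Vector3.one * (cellSize * 0.5f)))
                {
                    int channel = System.Array.IndexOf(detectableTags, hit.tag);
                    if (channel >= 0) obs[x, z, channel] = 1f;
                }
            }
        }
        return obs;
    }
}
```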

Real-time style transfer in Unity using deep neural networks

Posted in Technology

Deep learning now powers numerous AI technologies in daily life, and convolutional neural networks (CNNs) can apply complex treatments to images at high speed. At Unity, we aim to integrate CNN inference seamlessly into the 3D rendering pipeline. Unity Labs therefore works on advancing state-of-the-art research and on developing an efficient neural inference engine called Barracuda. In this post, we experiment with a challenging use case: multi-style in-game style transfer.

Deep learning has long been confined to supercomputers and offline computation, but its arrival in real time on consumer hardware is fast approaching thanks to ever-increasing compute capability. With Barracuda, Unity Labs hopes to accelerate that arrival in creators’ hands. While neural networks are already being used for game AI thanks to ML-Agents, many rendering applications have yet to be demonstrated in real-time game engines, for example deep-learned supersampling, ambient occlusion, global illumination, and style transfer. We chose style transfer to demonstrate the full pipeline, from training the network to integrating it in Unity’s rendering loop.
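To give a feel for what in-game inference looks like on the engine side, here is a minimal sketch using Barracuda’s public API. The model asset and render textures are assumptions; a production setup would also handle resolution and color-space details:

```csharp
using UnityEngine;
using Unity.Barracuda;

// Minimal sketch: run a pre-trained style-transfer network on a frame
// with Barracuda. The NNModel asset and the two render textures are
// assumptions; the network is assumed to map an RGB image to a stylized
// RGB image of the same size.
public class StyleTransferRunner : MonoBehaviour
{
    public NNModel modelAsset;      // ONNX model imported by Barracuda
    public RenderTexture source;    // frame to stylize
    public RenderTexture stylized;  // output target

    IWorker worker;

    void Start()
    {
        var model = ModelLoader.Load(modelAsset);
        // The compute backend keeps inference on the GPU, next to rendering.
        worker = WorkerFactory.CreateWorker(WorkerFactory.Type.ComputePrecompiled, model);
    }

    void Update()
    {
        using (var input = new Tensor(source, channels: 3))
        {
            worker.Execute(input);
            // PeekOutput returns a reference owned by the worker; no Dispose needed.
            worker.PeekOutput().ToRenderTexture(stylized);
        }
    }

    void OnDestroy() => worker?.Dispose();
}
```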

Learn the Input System with updated tutorials and our sample project, Warriors

Posted in Technology

With the Input System, you can quickly set up controls for multiple platforms, from mobile to VR. Get started with our example projects and new video tutorials for beginners and intermediate users.

Input is at the heart of what makes your real-time projects interactive. Unity’s Input System standardizes the way you implement controls and provides advanced new functionality. It’s verified for Unity 2019 LTS and newer versions (see the documentation for a full list of supported input devices).
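As a small taste of that standardization (a minimal sketch, not code from the Warriors project), a single action can serve a gamepad stick and keyboard keys at once, so the same code path covers multiple platforms:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Minimal sketch: one action bound to both a gamepad stick and WASD keys,
// read as a single 2D vector regardless of which device produced it.
public class MoveInput : MonoBehaviour
{
    InputAction move;

    void Awake()
    {
        move = new InputAction("Move", binding: "<Gamepad>/leftStick");
        // A composite binding maps four keys onto the same 2D vector.
        move.AddCompositeBinding("2DVector")
            .With("Up", "<Keyboard>/w")
            .With("Down", "<Keyboard>/s")
            .With("Left", "<Keyboard>/a")
            .With("Right", "<Keyboard>/d");
    }

    void OnEnable() => move.Enable();
    void OnDisable() => move.Disable();

    void Update()
    {
        Vector2 direction = move.ReadValue<Vector2>();
        transform.Translate(new Vector3(direction.x, 0f, direction.y) * Time.deltaTime);
    }
}
```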

Our new tutorial content can help you get started quickly, even if you’re completely new to developing for multiple platforms. If you’re already familiar with the workflows, learn how to use the Input System to drive other Unity tools like Cinemachine or Unity UI with Warriors, our main example project.

This Meet the Devs session from March explains why we created the new system and includes a demo that outlines workflows for setting up local multiplayer, quickly adding gamepad controls, spawning new players, and implementing mobile controls. Rene Damm, the lead developer of the Input System, also answers audience questions about tooling and the team’s roadmap.
