Best practices for bringing AR applications to the field
Posted in Technology

Learn how industrial giant ABB is using Unity and augmented reality to transform field […]

Robotics simulation in Unity is as easy as 1, 2, 3!
Robot development workflows rely on simulation for testing and training, and we want to show you how roboticists can use Unity for robotics simulation.

New career pathways help you break into the gaming and tech industries
Our new guided learning experiences help you create a portfolio, get Unity Certified, and prepare for a new job.

In parameters in Burst
The Unity Burst Compiler transforms your C# code into highly optimized machine code. One question we often get from our amazing forum users, like @dreamingimlatios, concerns in parameters to functions within Burst code. Should developers use them, and where? We've put together this post to explain them in more detail.

How Eidos-Montréal created Grid Sensors to improve observations for training agents
Within Eidos Labs, several projects use machine learning. The Automated Game Testing project tackles the problem of testing the functionality of expansive AAA games by modeling player behavior with agents that have learned behavior using reinforcement learning (RL). In this blog post, we'll describe how the team at Eidos Labs created the Grid Sensor within the Unity Machine Learning Agents Toolkit (ML-Agents) to better represent the game for machine learning, improving training times and ultimately leading to less expensive models.

Real-time style transfer in Unity using deep neural networks
Deep learning now powers numerous AI technologies in daily life, and convolutional neural networks (CNNs) can apply complex treatments to images at high speeds. At Unity, we aim to offer seamless integration of CNN inference in the 3D rendering pipeline. Unity Labs is therefore improving on state-of-the-art research and developing an efficient neural inference engine called Barracuda. In this post, we experiment with a challenging use case: multi-style in-game style transfer.

Deep learning has long been confined to supercomputers and offline computation, but its use in real time on consumer hardware is fast approaching thanks to ever-increasing compute capability. With Barracuda, Unity Labs hopes to accelerate its arrival in creators' hands. While neural networks are already being used for game AI thanks to ML-Agents, many applications to rendering have yet to be demonstrated in real-time game engines: for example, deep-learned supersampling, ambient occlusion, global illumination, and style transfer. We chose the latter to demonstrate the full pipeline, from training the network to integrating it into Unity's rendering loop.