Robot development workflows rely on simulation for testing and training, and we want to show you how roboticists can use Unity for robotics simulation.
Our new guided learning experiences help you create a portfolio, get Unity Certified, and prepare for a new job.
The Unity Burst Compiler transforms your C# code into highly optimized machine code. A question we often get from our amazing forum users like @dreamingimlatios concerns `in` parameters on functions within Burst code. Should developers use them, and where? We’ve put together this post to explain them in more detail.
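To make the question concrete, here is a minimal sketch of what an `in` parameter looks like inside a Burst-compiled job. All type and method names are illustrative, and the example assumes the Unity.Burst, Unity.Collections, Unity.Jobs, and Unity.Mathematics packages; `in` passes an argument by readonly reference, which can avoid copying larger structs on each call.

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

[BurstCompile]
public struct ScaleJob : IJob
{
    public NativeArray<float3> Points;
    public float4x4 Transform;

    // `in` passes the 64-byte matrix by readonly reference
    // instead of copying it by value on every call.
    static float3 TransformPoint(in float4x4 m, in float3 p)
    {
        return math.transform(m, p);
    }

    public void Execute()
    {
        for (int i = 0; i < Points.Length; i++)
            Points[i] = TransformPoint(Transform, Points[i]);
    }
}
```

Whether this actually helps depends on struct size and what the compiler can already do — which is exactly the trade-off the post digs into.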
Within Eidos Labs, several projects use machine learning. The Automated Game Testing project tackles the problem of testing the functionality of expansive AAA games by modeling player behavior with agents that have learned behavior using reinforcement learning (RL). In this blog post, we’ll describe how the team at Eidos Labs created the Grid Sensor within the Unity Machine Learning Agents Toolkit (ML-Agents) to better represent the game for machine learning, improving training times and ultimately leading to less expensive models.
Deep learning now powers numerous AI technologies in daily life, and convolutional neural networks (CNNs) can apply complex treatments to images at high speed. At Unity, we aim to offer seamless integration of CNN inference into the 3D rendering pipeline. Unity Labs is therefore building on state-of-the-art research and developing an efficient neural inference engine called Barracuda. In this post, we experiment with a challenging use case: multi-style in-game style transfer. Deep learning has long been confined to supercomputers and offline computation, but its real-time use on consumer hardware is fast approaching thanks to ever-increasing compute capability. With Barracuda, Unity Labs hopes to accelerate its arrival in creators’ hands. While neural networks are already being used for game AI thanks to ML-Agents, many applications to rendering — such as deep-learned supersampling, ambient occlusion, global illumination, and style transfer — have yet to be demonstrated in real-time game engines. We chose style transfer to demonstrate the full pipeline, from training the network to integrating it into Unity’s rendering loop.
It’s easy to automate playtesting by creating a Virtual Player (a game-playing agent), then using Game Simulation to run automated playtests with your Virtual Player at scale.
With the Input System, you can quickly set up controls for multiple platforms, from mobile to VR. Get started with our example projects and new video tutorials for beginners and intermediate users.
Input is at the heart of what makes your real-time projects interactive. Unity’s Input System standardizes the way you implement controls and provides new advanced functionality. It’s verified for Unity 2019 LTS and newer versions (see the documentation for a full list of supported input devices).
Our new tutorial content can help you get started quickly, even if you’re completely new to developing for multiple platforms. If you’re already familiar with the workflows, learn how to use the Input System to drive other Unity tools like Cinemachine or Unity UI with Warriors, our main example project.
This Meet the Devs session from March explains why we created this new system and includes a demo that outlines workflows for setting up local multiplayer, quickly adding gamepad controls, spawning new players, and implementing mobile controls. Rene Damm, the lead developer of the Input System, also answers questions from the audience about tooling and the team’s roadmap.
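As a taste of the workflows covered in the session, here is a minimal hedged sketch of polling a single action with the Input System. It assumes a `PlayerInput` component whose action map defines an action named "Jump"; the class name and log message are illustrative, not part of any shipped sample.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class JumpController : MonoBehaviour
{
    InputAction jumpAction;

    void Awake()
    {
        // Assumes a PlayerInput component on the same GameObject
        // with an action map containing a "Jump" action.
        var playerInput = GetComponent<PlayerInput>();
        jumpAction = playerInput.actions["Jump"];
    }

    void Update()
    {
        // WasPressedThisFrame is true only on the frame the control goes down.
        if (jumpAction.WasPressedThisFrame())
            Debug.Log("Jump pressed");
    }
}
```

Because the action is defined in an asset rather than hardcoded to a key, the same script works unchanged for keyboard, gamepad, or touch bindings.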
With more than 400 improvements, Unity 2020.2 TECH Stream continues our commitment to workflow, stability, and performance enhancements. Check out some of the key highlights below.
We had plans for 2020: to make Unity better for you. And we did. We reexamined our priorities, we listened to you, and we committed to improving performance and quality of life for all users – so you can bring your vision to life faster.
Unity 2020.2 TECH Stream is packed with the latest features for those with projects in pre-production, or simply for those who want to leverage cutting-edge tech for a competitive advantage. This version of Unity also ensures a smooth upgrade path. To get started, download it here today.
Following up on our promise to improve your development experience, in 2020 we shifted our release philosophy. We prioritized quality over quantity and reduced the number of releases to two per year, giving our engineers an extended stabilization phase.
If you’re about to lock in your production on a specific version of Unity, for maximum stability we always recommend you use the latest Long-Term Support (LTS) version of Unity (that’s why it’s the default download in the Unity Hub), currently Unity 2019 LTS. Unity 2020 LTS, which will include the feature set we’re summarizing below plus further stabilization and bug fixes, will be available in spring 2021.
Learn how Immersion leveraged Unity to quickly create and deploy interactive training for healthcare workers fighting COVID-19 on the front lines.
Unity powers Daimler to create mixed reality experiences across the automotive lifecycle.