
Tech News - Game Development

93 Articles

Unreal Engine 4.20 released with focus on mobile and immersive (AR/VR/MR) devices

Sugandha Lahoti
20 Jul 2018
4 min read
Following the release of Unreal Engine 4.19 this April, Epic Games has launched Unreal Engine 4.20. This major update focuses on enhancing scalability and creativity, helping developers create more realistic characters and immersive environments for games, film, TV, and VR/AR devices.

Multiple optimizations for mobile game development

Epic Games brought over 100 optimizations created for Fortnite on iOS and Android to Unreal Engine 4.20. Hardware occlusion queries are now supported on high-end iOS and Android devices that support ES 3.1 or Vulkan. Developers can also iterate and debug on Android without having to repackage the UE4 project, and now have unlimited Landscape Material layers on mobile devices.

Mixed Reality Capture

Unreal Engine 4.20 provides new Mixed Reality Capture functionality, which makes it easy to composite real players into a virtual space for mixed reality applications. It has three components: video input, calibration, and in-game compositing. You can use supported webcams and HDMI capture devices to pull real-world green-screened video into the Unreal Engine from a variety of sources. Setup and calibration are done through a standalone calibration tool that can be reused across Unreal Engine 4 titles.

Niagara visual effects editor

The Niagara visual effects editor is available as an early-access plugin. While Niagara builds on the same particle-manipulation methods as Cascade (UE4's previous VFX editor), unlike Cascade, Niagara is fully modular. UE 4.20 adds multiple improvements to effect design and creation: all of Niagara's modules have been updated to support behaviors commonly used in building effects for games, and new UI features have been added to the Niagara stack that mimic the options developers have with UProperties in C++. Niagara now supports GPU simulation on DX11, PS4, Xbox One, OpenGL (ES3.1), and Metal platforms; CPU simulation works on PC, PS4, Xbox One, OpenGL (ES3.1), and Metal. Niagara was showcased at GDC 2018; see the presentation "Programmable VFX with Unreal Engine's Niagara" for a complete overview.

Cinematic Depth of Field

Unreal Engine 4.20 also adds Cinematic Depth of Field, with which developers can achieve cinema-quality camera effects in real time. Cinematic DoF provides a cleaner depth-of-field effect with a cinematic appearance through a procedural bokeh simulation. It also features dynamic resolution stability, supports the alpha channel, and includes settings to scale it down for console projects. For additional information, see the Depth of Field documentation.

Proxy LOD improvements

The Proxy LOD tool is now production-ready. It improves performance by reducing the rendering cost of poly count, draw calls, and material complexity, yielding significant gains when developing for mobile and console platforms. The production-ready version of the Proxy LOD tool has several enhancements over the experimental version found in UE 4.19:

Improved normal control: the user may now supply the hard-edge cutoff angle and the method used in computing the vertex normal.
Gap filling: the Proxy system automatically discards any inaccessible structures, resulting in fewer total triangles and better use of the limited texture resource.

Magic Leap One early-access support

With Unreal Engine 4.20, game developers can now build for Magic Leap One. Unreal Engine 4 support for Magic Leap One uses built-in UE4 frameworks such as camera control, world meshing, motion controllers, and forward and deferred rendering. For developers with access to hardware, Unreal Engine 4.20 can deploy and run on the device, in addition to supporting Zero Iteration workflows through Play In Editor.

Read more:
The hype behind Magic Leap's new augmented reality headsets
Magic Leap's first AR headset, powered by Nvidia Tegra X2, is coming this summer

Apple ARKit 2.0 and Google ARCore 1.2 support

Unreal Engine 4.20 adds support for Apple's ARKit 2.0, bringing better tracking quality, vertical plane detection, face tracking, 2D and 3D image detection, and persistent and shared AR experiences. It also adds support for Google's ARCore 1.2, including vertical plane detection, Augmented Images, and Cloud Anchors for building collaborative AR experiences.

These are just a select few updates to the Unreal Engine. The full list of release notes is available on the Unreal Engine blog.

What's new in Unreal Engine 4.19?
Game Engine Wars: Unity vs Unreal Engine
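Niagara's fully modular design amounts to treating an emitter as an ordered stack of small update modules applied to each particle every tick. A minimal sketch of that idea (illustrative only: the function and field names here are invented, and this is not the UE4 API):

```python
# Sketch of a modular particle-update pipeline in the spirit of Niagara's
# module stack. Each "module" is a function that mutates one particle's
# state; an emitter is just an ordered list of modules.

def gravity(p, dt):
    # Pull the particle downward along the y axis.
    p["vel"][1] -= 9.8 * dt

def integrate(p, dt):
    # Advance position by the current velocity.
    p["pos"] = [x + v * dt for x, v in zip(p["pos"], p["vel"])]

def age(p, dt):
    # Tick down remaining lifetime.
    p["life"] -= dt

def simulate(particles, modules, dt):
    # Apply every module to every particle, then drop expired ones.
    for p in particles:
        for m in modules:
            m(p, dt)
    return [p for p in particles if p["life"] > 0]

particles = [{"pos": [0.0, 0.0], "vel": [1.0, 5.0], "life": 2.0}]
stack = [gravity, integrate, age]  # reorder or swap modules freely
for _ in range(10):
    particles = simulate(particles, stack, 0.1)
```

Because each behavior is an independent module, adding a new effect (drag, turbulence, color-over-life) means appending one function to the stack rather than editing a monolithic update loop.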


Obstacle Tower Environment 2.0: Unity announces Round 2 of its ‘Obstacle Tower Challenge’ to test AI game players

Sugandha Lahoti
15 May 2019
2 min read
At the end of January, Unity announced the 'Obstacle Tower Challenge' to test AI game players. The Obstacle Tower Challenge examines how AI software performs in computer vision, locomotion skills, and high-level planning. The challenge began on 11th February and will run through 24th May. Round 1 ran from 11th February to 31st March, and the results are just in: Unity received 2,000+ entries from 350+ teams.

Now, Unity has announced the launch of the second round of the challenge. Teams whose agent, trained in round one, received an average score of five on unseen versions of the tower will advance to round 2. Agents will need to account for a variety of new challenges in Obstacle Tower Environment 2.0, including enemies to dodge, distractions to avoid, and more complicated floor layouts with circling paths.

What's new in the Obstacle Tower Environment 2.0?

Unity has expanded the tower from 25 floors to 100, with three new visual styles: Industrial, Modern, and Future. The higher floors also contain new challenges beyond the ones already present, such as enemies to dodge, distracting TVs to avoid, more complex floor layouts with circling paths, and larger rooms on each floor with additional platforming challenges.

Obstacle Tower Environment 2.0 expands the number of parameters that can be customized when resetting the environment, including the ability to change the lighting, visual theme, floor layouts, and room contents of the floors in the tower. Unity has also reworked the placement of the reset button in puzzle rooms, which, based on feedback from round 1, was unintuitive: the block, goal, and reset-button positions in these rooms are now separated, making it less likely that an agent will press the reset button by accident. The Obstacle Tower Environment natively supports the Unity ML-Agents Toolkit.

To learn more about the environment, you can go through Unity's research paper. Unity has also released the final list of contestants selected for Round 2.

Unity has launched the 'Obstacle Tower Challenge' to test AI game players
Unity updates its TOS; developers can now use any third-party service that integrates into Unity
Improbable says Unity blocked SpatialOS; Unity responds, saying it has shut down Improbable and not SpatialOS
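The reset-time customization Unity describes follows a common config-on-reset pattern in RL environments. A minimal sketch with a stand-in environment class; the parameter names ("total-floors", "visual-theme", "tower-seed") are assumptions for illustration and may not match the real ObstacleTowerEnv keys:

```python
import random

class ToyTowerEnv:
    """Stand-in environment demonstrating the reset-config pattern.

    Not the real Obstacle Tower API; keys below are hypothetical.
    """
    DEFAULTS = {"total-floors": 100, "visual-theme": "Industrial", "tower-seed": -1}

    def __init__(self):
        self.config = dict(self.DEFAULTS)

    def reset(self, config=None):
        # Merge caller overrides into the defaults, rejecting unknown keys.
        overrides = config or {}
        unknown = set(overrides) - set(self.DEFAULTS)
        if unknown:
            raise ValueError(f"unknown config keys: {unknown}")
        self.config = {**self.DEFAULTS, **overrides}
        # -1 conventionally means "random tower each episode".
        seed = self.config["tower-seed"]
        self.rng = random.Random(None if seed == -1 else seed)
        return {"floor": 0, "theme": self.config["visual-theme"]}

env = ToyTowerEnv()
obs = env.reset(config={"visual-theme": "Modern", "tower-seed": 7})
```

The point of the pattern is that agents can be evaluated on held-out towers simply by resetting with an unseen seed, which is exactly how the challenge scores generalization.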


Meet yuzu – an experimental emulator for the Nintendo Switch

Sugandha Lahoti
17 Jul 2018
3 min read
The makers of Citra, an emulator for the Nintendo 3DS, have released a new emulator called yuzu. This emulator is made for the Nintendo Switch, the 7th major video game console from Nintendo.

The journey so far for yuzu

yuzu was initiated as an experimental setup by Citra's lead developer bunnei after he saw signs that the Switch's operating system was based on the 3DS's operating system. yuzu has the same core code as Citra and much of the same OS High-Level Emulation (HLE). The core emulation and memory management of yuzu are based on Citra, albeit modified to work with 64-bit addresses. It also has a loader for Switch games and Unicorn integration for CPU emulation.

yuzu uses a reverse-engineering (RE) process to figure out how games work and how the Switch GPU works. The Switch's GPU is more advanced than the 3DS's used in Citra and poses multiple challenges to reverse engineer. However, the RE process for yuzu is essentially the same as Citra's, and most of the RE and other development is being done in a trial-and-error manner.

OS emulation

The Switch's OS is based on the Nintendo 3DS's OS, so the developers reused a large part of Citra's OS HLE code for yuzu. The loader and file-system service were reused from Citra and modified to support Switch game dump files. The kernel OS threading, scheduling, and synchronization fixes for yuzu were also ported from Citra's OS implementation, as was the save-data functionality, which allows games to read and write files to the save-data directory.

Switchbrew helped them create libnx, a userland library for writing homebrew apps for the Nintendo Switch. (Homebrew is a popular term for applications that are created and executed on a video game console by hackers, programmers, developers, and consumers.)

The Switch IPC (inter-process communication) process is much more robust and complicated than the 3DS's. Its system has different command modes, a typical IPC request/response, and Domains to efficiently conduct multiple service calls. yuzu uses the Nvidia services to configure the video driver and get graphics output. However, Nintendo repurposed the Android graphics stack for rendering on the Switch, so the yuzu developers had to implement this even to get homebrew applications to display graphics.

The next steps

Being at a nascent stage, yuzu still has a long way to go. The developers still have to add HID (user input) support, such as support for all 9 controllers, rumble, LEDs, layouts, etc. Audio HLE is currently in progress, but audio playback is yet to be implemented. Properly implemented audio playback would be a major breakthrough, as most complicated games hang or deadlock because of this issue. They are also working on minor fixes to help games like Super Mario Odyssey, 1-2-Switch, and The Binding of Isaac boot further.

Be sure to read the entire progress report on the yuzu blog.

AI for game developers: 7 ways AI can take your game to the next level
AI for Unity game developers: How to emulate real-world senses in your NPC agent behavior
Unity 2018.2: Unity release for this year 2nd time in a row!


Google announces Stadia, a cloud-based game streaming service, at GDC 2019

Bhagyashree R
20 Mar 2019
3 min read
Yesterday, at the ongoing Game Developers Conference (GDC), Google marked its entry into the game industry with Stadia, its new cloud-based platform for streaming games. It will launch later this year in select regions, including the U.S., Canada, the U.K., and Europe.

https://twitter.com/GoogleStadia/status/1108097130147860480

GDC 2019 is a five-day event, which commenced on the 18th of this month in San Francisco, CA. It is the world's largest game-industry event, bringing together 28,000 attendees to share ideas and discuss the future of the gaming industry.

What is Stadia?

Phil Harrison, Google Vice President and GM, said while announcing the game-streaming platform, "Our ambition is far beyond a single game. The power of instant access is magical, and it's already transformed the music and movie industries."

Stadia is a cloud-based game-streaming platform that aims to bring together gamers, YouTube broadcasters, and game developers "to create a new experience". Games are streamed from a data center to any device that can connect to the internet: TV, laptop, desktop, tablet, or mobile phone. Gamers will be able to access their games anytime and on virtually any screen, and game developers will be able to use nearly unlimited resources for developing games. Since all the graphics processing happens on off-site hardware, there will be little stress on your local hardware.

The demo that Google shared at GDC streams video at 1080p, 60 frames per second. At launch, Stadia will support up to 4K resolution at 60 frames per second with approximately 25 Mbps of bandwidth. In the future, Google plans to offer 8K resolution and 120 frames per second. Google, in partnership with AMD, is building a custom GPU for its data centers that will deliver 10.7 teraflops of power. Each Stadia instance will be powered by a custom 2.7 GHz x86 processor with 16 GB of RAM.

Stadia Controller

At GDC, Google also talked about a dedicated controller for Stadia that connects directly to a game session in the cloud through WiFi. The controller provides a button for capturing, saving, and sharing gameplay in up to 4K resolution, and comes integrated with Google Assistant and a built-in microphone. According to a blog post shared by Google, it is not guaranteed that the controller will be offered for sale, as the device is not yet authorized by the Federal Communications Commission.

While unveiling the game-streaming service, Google did not reveal any details on pricing, and it is not yet known exactly when the service will reach gamers and developers. To know more about Stadia, check out the official announcement on Google's blog.

Google is planning to bring Node.js support to Fuchsia
Google to be the founding member of CDF (Continuous Delivery Foundation)
Google announces the stable release of Android Jetpack Navigation
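As a back-of-the-envelope check of those streaming numbers: uncompressed 4K60 video is orders of magnitude larger than the quoted ~25 Mbps stream, which shows how much work the video codec must do. This sketch assumes 8-bit 4:2:0 chroma subsampling, i.e. 12 bits per pixel:

```python
# How much compression does 4K60 over ~25 Mbps imply?
width, height, fps = 3840, 2160, 60   # 4K UHD at 60 frames per second
bits_per_pixel = 12                   # 8-bit 4:2:0 sampling (assumption)

raw_bps = width * height * bits_per_pixel * fps   # uncompressed bitrate
stream_bps = 25e6                                 # Stadia's quoted ~25 Mbps
ratio = raw_bps / stream_bps

print(f"raw: {raw_bps / 1e9:.2f} Gbps, compression ratio ~{ratio:.0f}:1")
```

Roughly 6 Gbps of raw video squeezed into 25 Mbps is a ~240:1 compression ratio, well within reach of modern codecs at these resolutions, but it explains why encoding happens on Stadia's data-center hardware rather than on the client.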


Unreal Engine 4.22 update: support added for Microsoft’s DirectX Raytracing (DXR)

Melisha Dsouza
15 Feb 2019
3 min read
On 12th February, Epic Games released a preview build of Unreal Engine 4.22; a major upgrade among numerous other features and fixes is the support for real-time ray tracing and path tracing. The new build extends preliminary support for Microsoft's DirectX Raytracing (DXR) extensions to the DirectX 12 API. Developers can now try their hands at ray-traced games developed with Unreal Engine 4.

Very few games currently support ray tracing: only Battlefield V (ray-traced reflections) and Metro Exodus (ray-traced global illumination) feature ray-tracing effects, and those are built on the proprietary Frostbite 3 and 4A game engines.

Fun Fact: Ray tracing is a much more advanced and lifelike way of rendering light and shadows in a scene. Movies and TV shows use it to create and blend in amazing CG work with real-life scenes, leading to more lifelike, interactive, and immersive game worlds with more realistic lighting, shadows, and materials.

The patch notes released by the team state that they have added low-level support for ray tracing:

Added ray tracing low-level support: implemented a low-level layer on top of UE DirectX 12 that provides support for DXR and allows creating and using ray-tracing shaders (ray generation shaders, hit shaders, etc.) to add ray-tracing effects.
Added high-level ray tracing features: rect area lights, soft shadows, reflections, reflected shadows, ambient occlusion, RTGI (ray-traced global illumination), translucency, clearcoat, IBL, sky.
Geometry types: triangle meshes, static and skeletal (morph targets and skin cache), Niagara particles support.
Texture LOD.
Denoiser: shadows, reflections, AO.
Path Tracer: unbiased, full-GI path tracer for making ground-truth reference renders inside UE4.

According to HardOCP, the feature isn't technically tied to Nvidia RTX, but since Turing cards are the only ones with driver support for DirectX Raytracing at the moment, developers need an RTX 2000-series GPU to test out Unreal's ray tracing. There has been much debate about the RTX cards offered by NVIDIA in the past: while the concept sounded interesting at the beginning, very few engines adopted the idea, simply because previous-generation processors cannot support all the features of NVIDIA's RTX. Now, with DXR in the picture, it will be interesting to see the outcome of games developed using ray tracing.

Head over to Unreal Engine's official post to know more about this news.

Implementing an AI in Unreal Engine 4 with AI Perception components [Tutorial]
Unreal Engine 4.20 released with focus on mobile and immersive (AR/VR/MR) devices
Game Engine Wars: Unity vs Unreal Engine
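To make the definition of ray tracing concrete: at its core, a ray tracer repeatedly intersects rays with scene geometry to decide what light reaches each pixel. A minimal ray-sphere intersection, the primitive operation performed millions of times per frame (plain illustrative math, not UE4 or DXR code):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the distance t to the nearest hit of the ray
    o + t*d with a sphere, or None if the ray misses.

    Solves |o + t*d - c|^2 = r^2, a quadratic in t.
    """
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearer of the two roots
    return t if t > 0 else None            # hit must be in front of the ray

# A ray from the origin looking down -z hits a unit sphere centred at z = -5
# on its near surface, at distance 4.
t = ray_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
```

Effects like the reflections and soft shadows listed in the patch notes come from spawning secondary rays at each hit point and running this same intersection test again, which is why dedicated hardware acceleration matters so much.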


Introducing SCRIPT-8, an 8-bit JavaScript-based fantasy computer to make retro-looking games

Bhagyashree R
28 Jan 2019
2 min read
Adding to the list of fantasy consoles/computers is the newly introduced SCRIPT-8, written by Gabriel Florit, a graphics reporter at the Washington Post who also likes working with augmented reality.

https://twitter.com/gabrielflorit/status/986716413254610944

SCRIPT-8 is a JavaScript-based fantasy computer for making, playing, and sharing tiny retro-looking games. Based on Bret Victor's "Inventing on Principle" and "Learnable Programming", it provides programmers a live-coding experience, meaning the program's output updates as they code.

The games built with SCRIPT-8 are called cassettes. These cassettes are recorded at a URL, which you can share with anyone and play with a keyboard or gamepad. You can also make your own version of an existing cassette by changing its code, art, or music, and record it to a different cassette.

What are SCRIPT-8's features?

A code editor that gives you immediate feedback.
A slider with which you can update numbers without typing.
A time-traveling tool for pausing and rewinding the game; buttons let you see a character's past and future paths.
A sprite editor whose changes are reflected in the game instantly.
A map editor to create new paths.
A music editor with which you can create phrases, group them into chains, and turn those into songs.

You can read more about SCRIPT-8 on its website.

Google DeepMind's AI AlphaStar beats StarCraft II pros TLO and MaNa; wins 10-1 against the gamers
Fortnite server suffered a minor outage, Epic Games was quick to address the issue
Deepmind's AlphaZero shows unprecedented growth in AI, masters 3 different games

Microsoft announces Game stack with Xbox Live integration to Android and iOS

Natasha Mathur
15 Mar 2019
3 min read
Microsoft has good news for all the game developers out there. Yesterday it launched a new initiative, called Microsoft Game Stack, which brings together different Microsoft tools and services into a single robust ecosystem to 'empower game developers'. Whether you're a rookie indie developer or an AAA studio, this developer-focused platform aims to make the game-development process far easier.

The main goal of Game Stack is to help developers find the different tools and services required for game development in one spot. These tools range from Azure, PlayFab, DirectX, and Visual Studio to Xbox Live, App Center, and Havok.

The cloud plays a major role in Game Stack, and it relies on Azure to fulfill this requirement. Azure, globally available in 54 regions, will help scale Project xCloud (a service that streams games to PCs, consoles, and mobile devices) to provide an uninterrupted gaming experience for players worldwide. Companies like Rare, Ubisoft, and Wizards of the Coast are already hosting multiplayer game servers and storing their player data on Azure. Azure is also capable of analyzing game telemetry, protecting games from DDoS attacks, and training AI. Moreover, Microsoft Game Stack is device-agnostic, which makes it convenient for gamers.

Another key component of Game Stack is PlayFab, a backend service for building and operating new games. PlayFab brings game-development services, real-time analytics, and LiveOps capabilities to Game Stack. PlayFab is also device-agnostic: it supports iOS, Android, PC, Web, Xbox, Sony PlayStation, Nintendo Switch, and all the other major game engines such as Unity and Unreal. Microsoft has also released previews for five new PlayFab services. Of these, one, called PlayFab Matchmaking, is in public preview, while the other four, PlayFab Party, PlayFab Insights, PlayFab PubSub, and PlayFab User-Generated Content, are in private preview.

Game Stack also comes with Xbox Live, one of the most engaging and interactive gaming communities in the world, which will provide identity and community services within Game Stack. Microsoft has also expanded the cross-platform capabilities of Xbox Live under Game Stack with a new SDK for iOS and Android devices, letting mobile developers easily connect with some of the most highly engaged and passionate gamers on the planet. Other benefits of the Xbox Live SDK include more focus on building games and leveraging Microsoft's trusted identity network to offer support for log-in, privacy, online safety, and child accounts. On top of that, features like gamerscore and 'hero' stats help keep gamers engaged.

Components such as Mixer, DirectX, Azure App Center, Visual Studio, Visual Studio Code, and Havok are all also part of Game Stack.

For more information, check out the official Microsoft Game Stack blog post.

Microsoft open sources the Windows Calculator code on GitHub
Microsoft open sources 'Accessibility Insights for Web', a Chrome extension to help web developers fix their accessibility issues
Microsoft researchers introduce a new climate-forecasting model and a public dataset to train these models


Unite Berlin 2018 Keynote: Unity partners with Google, launches Ml-Agents ToolKit 0.4, Project MARS and more

Sugandha Lahoti
20 Jun 2018
5 min read
Unite Berlin 2018, Unity's annual developer conference, kicked off on June 19, 2018. This three-day extravaganza is filled with new announcements, sessions, and workshops from the creators of Unity; it's a place to develop, network, and participate with artists, developers, filmmakers, researchers, storytellers, and other creators. Day 1 was inaugurated with the keynote, presented by John Riccitiello, CEO of Unity Technologies. It featured previews of upcoming Unity technology, most prominently Unity's alliance with Google Cloud to help developers build connected games. Let's take a look at what was showcased.

Connected games with Unity and Google Cloud

Unity and Google Cloud have collaborated to help developers create real-time multiplayer games. They are building a suite of managed services and tools to help developers build, test, and run connected experiences while offloading the hard work of quickly scaling game servers to Google Cloud. Games can be easily scaled to meet the needs of the players, and game developers can harness the massive power of Google Cloud without having to be cloud experts. Here's what Google Cloud with Unity has in store:

Game-server hosting: streamlined resources to develop and scale hosted multiplayer games.
Sample FPS: a production-quality sample project of a real-time multiplayer game.
New ECS networking layer: fast, flexible networking code that delivers performant multiplayer by default.

Unity ML-Agents Toolkit v0.4

A new version of the Unity ML-Agents Toolkit was also announced at Unite Berlin. The v0.4 toolkit hosts multiple updates requested by the Unity community. Game developers now have the option to train environments directly from the Unity editor, rather than as built executables: developers simply launch the learn.py script and then press the "play" button from within the editor to start training. Unity has also launched a set of two new challenging environments: Walker, a physics-based humanoid ragdoll, and Pyramids, a complex sparse-reward environment. There are also algorithmic improvements in reinforcement learning: agents can now learn to solve tasks that were previously learned only with great difficulty. Unity is also partnering with Udacity to launch a Deep Reinforcement Learning Nanodegree to help students and professionals gain a deeper understanding of reinforcement learning.

Augmented reality with Project MARS

Unity also announced Project MARS, a Mixed and Augmented Reality Studio that will be provided as a Unity extension. The studio will allow game developers to build AR and MR applications that intelligently interact with any real-world environment, with little to no custom coding.

Unite Berlin - AR Keynote Reel

MARS will include abstract layers for object recognition, location, and map data, and will ship with sample templates with simulated rooms for testing against different environments inside the editor. AR-specific gizmos will be provided to easily define spatial conditions like plane size, elevation, and proximity without requiring code or precise measurements. It will also have elements ranging from face masks to avatars to entire rooms of digital art. Project MARS will come to Unity as an experimental package later this year.

Unity also unveiled a Facial AR Remote component. Powered by augmented reality, this component can perform and capture animated characters, allowing filmmakers and CGI developers to shoot CG content with body movement, just like they would with live action.

Kinematica: a machine learning-powered animation system

Unity also showcased its AI research by announcing Kinematica, an all-new ML-powered animation system. Kinematica improves on traditional animation systems, which generally require animators to explicitly define transitions. Kinematica does not have any superimposed structure, like graphs or blend trees; it generates smooth transitions and movements by applying machine learning to any data source. Game developers and animators no longer need to manually map out animation graphs.

Unite Berlin 2018 - Kinematica Demo

Kinematica decides in real time how to combine data clips from a single library into a sequence that matches the controller input, the environment content, and the gameplay requests. As with Project MARS, Kinematica will be available later this year as an experimental package.

New Prefab workflows

The Prefab system has been revamped with multiple improvements, and the improved Prefab workflow is now available as a preview build. New additions include Prefab Mode, prefab variants, and nested prefabs. Prefab Mode allows faster, more efficient, and safer editing of Prefabs in an isolated mode, without adding them to the actual scene. Developers can now edit model prefabs, with changes propagated to all prefab variants. With nested prefabs, teams can work on different parts of a prefab and then come together for the final asset.

Predictive Personalized Placements

Personalized Placements bring the best of both worlds to players and the commercial business. With this new feature, game developers can create tailor-made game experiences for each player. The feature runs on an engine powered by predictive analytics, which determines what to show each player based on what will drive the highest engagement and lifetime value: an ad, an IAP promotion, a notification of a new feature, or a cross-promotion. And the algorithm will only get better with time.

These were only a select few of the announcements presented in the Unite Berlin keynote. You can watch the full video on YouTube. Details on other sessions, seminars, and activities are available on the Unite website.

GitHub for Unity 1.0 is here with Git LFS and file locking support
Unity announces a new automotive division and two-day Unity AutoTech Summit
Put your game face on! Unity 2018.1 is now available
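Kinematica's real-time clip selection, as described in the keynote, resembles motion matching: search an animation library for the clip whose features best fit the current pose and controller input. A toy sketch of that search (illustrative throughout: the feature layout and clip names are invented, and this is not Unity's implementation):

```python
# Motion-matching sketch: pick, from a library of animation clips, the one
# whose feature vector is nearest to the current query. Here a feature
# vector is just [forward speed, sideways speed]; a real system would use
# many more pose and trajectory features.

def distance(a, b):
    # Squared Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_match(query, library):
    # Linear nearest-neighbour search; real systems use acceleration
    # structures since the library holds thousands of frames.
    return min(library, key=lambda entry: distance(entry["features"], query))

library = [
    {"clip": "idle",   "features": [0.0, 0.0]},
    {"clip": "walk",   "features": [1.0, 0.0]},
    {"clip": "run",    "features": [3.0, 0.0]},
    {"clip": "strafe", "features": [0.0, 1.5]},
]

# Query: the player pushes the stick forward at ~0.9 m/s.
match = best_match([0.9, 0.0], library)
```

Because the output is whatever clip best matches the query at each frame, no explicit transition graph is needed; smooth movement falls out of having a dense enough library, which is the property the keynote highlights.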


Riot Games is struggling with sexism and lack of diversity; employees plan to walk out in protest

Sugandha Lahoti
30 Apr 2019
7 min read
Update 23rd August 2019: Riot Games has finally settled a class-action lawsuit filed by Riot workers on grounds of sexual harassment and discrimination faced at their workspace. "This is a very strong settlement agreement that provides meaningful and fair value to class members for their experiences at Riot Games," said Ryan Saba of Rosen Saba, LLP, the attorney representing the plaintiffs. "This is a clear indication that Riot is dedicated to making progress in evolving its culture and employment practices. A number of significant changes to the corporate culture have been made, including increased transparency and industry-leading diversity and inclusion programs. The many Riot employees who spoke up, including the plaintiffs, significantly helped to change the culture at Riot." "We are grateful for every Rioter who has come forward with their concerns and believe this resolution is fair for everyone involved," said Nicolo Laurent, CEO of Riot Games. "With this agreement, we are honoring our commitment to find the best and most expeditious way for all Rioters, and Riot, to move forward and heal." Update as on 6 May, 2019: Riot Games announced early Friday that they will soon start giving new employees the option to opt-out of some mandatory arbitration requirements when they are hired. The catch - The arbitration will initially narrowly focused on a specific set of employees for a specific set of causes. Riot games employees are planning to walkout in protest of the company’s sexist culture and lack of diversity. Riot has been in the spotlight since Kotaku published a detailed report highlighting how five current and former Riot employees filed lawsuits against the company citing the sexist culture that fosters in Riot. Out of the five employees, two were women. Per Kotaku, last Thursday, Riot filed a motion to force two of those women, whose lawsuits revolved around the California Equal Pay Act, into private arbitration. 
In its motions, Riot's lawyers argue that these employees waived their rights to a jury trial when they signed arbitration agreements upon hiring. Private arbitration makes these employees less likely to win against Riot. In November last year, 20,000 Google employees, along with temps, vendors, and contractors, walked out to protest the discrimination, racism, and sexual harassment encountered at Google's workplace. This walkout led to Google ending forced arbitration for its full-time employees. Google employees are also organizing a phone drive, announced in a letter published on Medium, to press lawmakers to legally end forced arbitration. Per The Verge, "The employees are organizing a phone bank for May 1st and asking for people to make three calls to lawmakers — two to the caller's senators and one to their representative — pushing for the FAIR Act, which was recently reintroduced in the House of Representatives." https://twitter.com/endforcedarb/status/1122864987243012097 Following Google, Facebook also made changes to its forced arbitration policy for sexual harassment claims. Sexual harassment is not the only issue: game developers also endure unfair treatment in terms of work conditions, job instability, and inadequate pay. In February, the American Federation of Labor and Congress of Industrial Organizations (AFL-CIO) published an open letter on Kotaku urging video game industry workers to unionize and voice their support for better treatment within the workplace. Following this momentum, Riot employees have organized a walkout demanding that Riot leadership end forced arbitration against the two current employees. The walkout is planned for Monday, May 6.
An internal document from Riot employees, as seen by Kotaku, describes the demands laid out by walkout organizers: a clear intention to end forced arbitration; a precise deadline (within 6 months) by which to end it; and a commitment to not force arbitration on the women involved in the ongoing litigation against Riot.

Riot's sexist culture and lack of diversity

The investigation conducted by Kotaku last year unveiled major flaws in Riot's culture, and in gaming companies in general. Over 28 current and former Riot employees spoke to Kotaku, with stories that told of Riot's female employees being treated unfairly and being on the receiving end of gender discrimination. An employee named Lucy told Kotaku that when she tried to hire a woman into a leadership role, she heard plenty of excuses for why her female job candidates weren't Riot material. Some were "ladder climbers." Others had "too much ego." Most weren't "gamer enough." A few were "too punchy," or didn't "challenge convention," she told Kotaku. She also shared her personal experiences of discrimination. Often her manager would imply that her position was a direct result of her appearance. Every few months, she said, a male boss of hers would comment in public meetings about how her kids and husband must really miss her while she was at work. Women are often told they don't fit the company's "bro culture"; an astonishing eighty percent of Riot employees are men, according to data Riot collected from employees' driver's licenses. "The 'bro culture' there is so real," said one female source to Kotaku, who said she'd left the company due to sexism. "It's agonizingly real. It's like working at a giant fraternity." Others Kotaku interviewed told stories of how women were groomed for promotions, doing jobs above their title and pay grade, until men were suddenly brought in to replace them.
Another woman told Kotaku "how a colleague once informed her, apparently as a compliment, that she was on a list getting passed around by senior leaders detailing who they'd sleep with." Two former employees also added that they "felt pressure to leave after making their concerns about gender discrimination known." Many former Riot employees refused to come forward with their stories and refrained from participating in the walkout. For some, this was out of fear of retaliation from Riot's fanbase; Riot is the creator of the popular game League of Legends. Others said they were restricted from talking on the record by non-disparagement agreements they signed before leaving the company. The walkout threat spread far enough that it prompted a response from Riot's chief diversity officer, Angela Roseboro, in the company's private Slack over the weekend, reports Waypoint. In a copy of the message obtained by Waypoint, Roseboro says: "We're also aware there may be an upcoming walkout and recognize some Rioters are not feeling heard. We want to open up a dialogue on Monday and invite Rioters to join us for small group sessions where we can talk through your concerns, and provide as much context as we can about where we've landed and why. If you're interested, please take a moment to add your name to this spreadsheet. We're planning to keep these sessions smaller so we can have a more candid dialogue." Riot CEO Nicolo Laurent also acknowledged the talk of a walkout in a statement: "We're proud of our colleagues for standing up for what they believe in. We always want Rioters to have the opportunity to be heard, so we're sitting down today with Rioters to listen to their opinions and learn more about their perspectives on arbitration. We will also be discussing this topic during our biweekly all-company town hall on Thursday.
Both are important forums for us to discuss our current policy and listen to Rioter feedback, which are both important parts of evaluating all of our procedures and policies, including those related to arbitration." Tech worker groups Game Workers Unite and Googlers for Ending Forced Arbitration have stood up in solidarity with Riot employees: "Forced arbitration clauses are designed to silence workers and minimize the options available to people hurt by these large corporations" https://twitter.com/GameWorkers/status/1122933899590557697 https://twitter.com/endforcedarb/status/1123005582808682497 "Employees at Riot Games are considering a walkout, and the organization efforts has prompted an internal response from company executives", tweeted Coworker.org https://twitter.com/teamcoworker/status/1122936953698160640 Others have also joined in support. https://twitter.com/theminolaur/status/1122931099057950720 https://twitter.com/LuchaLibris/status/1122929166037471233 https://twitter.com/floofyscorp/status/1122955992268967937 #NotOkGoogle: Employee-led town hall reveals hundreds of stories of retaliation at Google DataCamp reckons with its #MeToo movement; CEO steps down from his role indefinitely Microsoft's #MeToo reckoning: female employees speak out against workplace harassment and discrimination
Put your game face on! Unity 2018.1 is now available

Sugandha Lahoti
07 May 2018
4 min read
Unity Technologies has announced the release of its latest platform update, Unity 2018.1, giving artists, developers, and game engineers the power to express their talents and collaborate more efficiently to build games. Unity 2018.1 also marks the start of a new release cycle: since 2017, Unity has followed a plan of shipping a new version every quarter, and Unity 2018.1 is the first release of the 2018 series. According to Brett Bibby, VP of Engineering, Unity Technologies, "With Unity 2018.1 we are introducing one of the largest upgrades in the history of our company, and it's centered around two major concepts - next-level rendering and performance by default." This release features two new upgrades, the Scriptable Render Pipeline and the Entity Component System, which together make it easier for creators to build richer experiences that utilize modern hardware to deliver beautiful graphics.

Next-level rendering with Scriptable Render Pipeline (SRP)

The Scriptable Render Pipeline (SRP) is available in preview in Unity 2018.1. With SRP, developers and technical artists can work directly with hardware and GPUs without having to go through millions of lines of C++ engine code. SRP makes it easy to customize the rendering pipeline via C# code and material shaders. Unity 2018.1 also introduces two render pipelines: the High-Definition Render Pipeline (HD RP), for developers with AAA aspirations, and the Lightweight Render Pipeline (LW RP), for those looking for a combination of graphics and speed; the latter optimizes battery life for mobile devices and similar platforms.

Performance by default with the C# Job System & Entity Component System (ECS)

The C# Job System enables developers to write very fast, parallelized code in C# to take full advantage of multicore processors. It also provides protection from the pitfalls of multi-threading, such as race conditions and deadlocks.
The Job System is combined with a new programming model, the Entity Component System (ECS). ECS lets developers harness multicore processors without having to manage threading themselves; they can use that power to add more effects and complexity to games, or add AI to make their creations richer and more immersive. It uses a data-oriented design instead of an object-oriented approach, which makes code easier to reuse and easier for others to understand and work on.

Level design and shaders

Unity 2018.1 reduces the time and effort required by artists, designers, and developers by allowing them to create levels, cinematic content, and gameplay sequences without coding. New tools like ProBuilder/Polybrush and the new visual Shader Graph offer intuitive ways to design levels and create shaders without programming skills. ProBuilder is a unique hybrid of 3D-modeling and level-design tools, optimized for building simple geometry but capable of detailed editing and UV unwrapping as needed. With Polybrush, developers can blend textures and colors, sculpt meshes, and scatter objects directly in the Unity editor. Shader Graph builds shaders visually using a designer tool, without writing a single line of code; it offers easy drag-and-drop usability to create and connect nodes in a graph network.

Unity Package Manager UI

Unity 2018.1 builds on the package manager introduced in Unity 2017.2, adding the newly released Package Manager User Interface, the Hub, and Project Templates to help start new projects faster and more efficiently. The Unity Package Manager UI improves the following aspects of the project management workflow: quick access to newly released features; the latest fixes, instantly; access to preview features; and easy sharing of lightweight projects. Unity 2018.1 offers support for more than 25 platforms, including Magic Leap One, Oculus Go, ARCore 1.1, Android ARM64, Daydream Standalone, and more.
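To make the data-oriented idea behind ECS concrete, here is a minimal sketch in plain Python. This is purely illustrative and is not Unity's C# API: every name (`World`, `movement_system`, the component stores) is invented. The point it demonstrates is that components are plain data keyed by entity ID, and a "system" is just a function that processes every entity holding the components it needs.

```python
# Illustrative entity-component-system sketch (data-oriented design).
# All names are hypothetical; Unity's real ECS is C# and differs in detail.

class World:
    def __init__(self):
        self.next_id = 0
        self.position = {}   # component store: entity id -> (x, y)
        self.velocity = {}   # component store: entity id -> (vx, vy)

    def create_entity(self, pos=None, vel=None):
        eid = self.next_id
        self.next_id += 1
        if pos is not None:
            self.position[eid] = pos
        if vel is not None:
            self.velocity[eid] = vel
        return eid

def movement_system(world, dt):
    # A "system" touches only entities that have the components it needs:
    # here, both a velocity and a position.
    for eid, (vx, vy) in world.velocity.items():
        if eid in world.position:
            x, y = world.position[eid]
            world.position[eid] = (x + vx * dt, y + vy * dt)

world = World()
mover = world.create_entity(pos=(0.0, 0.0), vel=(1.0, 2.0))
scenery = world.create_entity(pos=(5.0, 5.0))  # no velocity: skipped by the system
movement_system(world, dt=0.5)
print(world.position[mover])    # (0.5, 1.0)
print(world.position[scenery])  # unchanged: (5.0, 5.0)
```

Because the data for each component type is stored together rather than scattered across objects, systems like `movement_system` can iterate over tightly packed data, which is what makes the real ECS cache-friendly and easy to parallelize.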
You can refer to the release notes for the full list of new features, improvements, and fixes. Unity will be showcasing all their latest innovations during Unite Berlin scheduled on June 19 - 21, 2018. Unity plugins for augmented reality application development Game Engine Wars: Unity vs Unreal Engine Unity releases ML-Agents v0.3: Imitation Learning, Memory-Enhanced Agents and more  
Epic Games announces: Epic MegaGrants, RTX-powered Ray tracing demo, and free online services for game developers

Natasha Mathur
22 Mar 2019
4 min read
Epic Games, an American video game and software development company, made a series of announcements earlier this week. These include: Epic Games' CEO, Tim Sweeney, offering $100 million in grants to game developers; a stunning RTX-powered ray-tracing demo named Troll; and the launch of Epic's free online services for game developers.

Epic MegaGrants: $100 million in funds for game developers

Tim Sweeney, CEO of Epic Games Inc, announced earlier this week that he will be offering $100 million in grants to game developers to boost the growth of the gaming industry. Sweeney made the announcement during a presentation on Wednesday at the Game Developers Conference (GDC), the world's largest professional game industry event, which ended yesterday in San Francisco. Epic Games had previously created a $5 million fund for grants, disbursed over the last three years. Now it is building a new fund called Epic MegaGrants. These are "no-strings-attached" grants, meaning they involve no contracts requiring game developers to do anything for Epic. All game developers need to do is apply for the grants and create an innovative project; if Epic's judges find it worthy, they'll offer the funds. "There are no commercial hooks back to Epic. You don't have to commit to any deliverables. This is our way of sharing Fortnite's unbelievable success with as many developers as we can", said Sweeney.

Troll: a ray-tracing Unreal Engine 4 demo

Another eye-grabbing moment at GDC this year was a "visually stunning" ray-tracing demo revealed by Goodbye Kansas and Deep Forest Films, called "Troll". Troll was rendered in real time using Unreal Engine 4.22 ray tracing and camera effects, powered by a single NVIDIA GeForce RTX 2080 Ti graphics card. Troll is visually inspired by Swedish painter and illustrator John Bauer, whose illustrations for the Swedish folklore and fairy-tale anthology 'Among Gnomes and Trolls' are famous.
https://www.youtube.com/watch?v=Qjt_MqEOcGM (Troll) "Ray tracing is more than just reflections — it's about all the subtle lighting interactions needed to create a natural, beautiful image. Ray tracing adds these subtle lighting effects throughout the scene, making everything look more real and natural," said Nick Penwarden, Director of Engineering for Unreal Engine at Epic Games. The NVIDIA team states in a blog post that Epic Games has been working to integrate RTX-accelerated ray tracing into its popular Unreal Engine 4. In fact, Unreal Engine 4.22 will support the new Microsoft DXR API for real-time ray tracing.

Epic's free online services launch for game developers

Epic Games also announced the launch of free tools and services, part of the Epic Online Services announced in December 2018. The SDK is available via the new developer portal for immediate download and use, and currently supports Windows, Mac, and Linux. As part of the release, the SDK provides support for two free services: game analytics and player ticketing. Game analytics helps developers understand player behavior, featuring DAU (daily active users), MAU (monthly active users), retention, new player counts, game launch counts, online user counts, and more. The ticketing system connects players directly with developers and allows them to report bugs or other problems. These two services will continue to evolve along with the rest of Epic Online Services (EOS) to offer the infrastructure and tools developers need to launch, operate, and scale high-quality online games. Epic Games will also offer additional free services throughout 2019, including player data storage, player reports, leaderboards & stats, player identity, player inventory, matchmaking, and more.
“We are committed to developing EOS with features that can be used with any engine, any store and that can support any major platform...these services will allow developers to deliver cross-platform gameplay experiences that enable players to enjoy games no matter what platform they play on”, states the Epic Games team. Fortnite server suffered a minor outage, Epic Games was quick to address the issue Epic games CEO calls Google “irresponsible” for disclosing the security flaw in Fortnite Android installer Fortnite creator Epic games launch Epic games store where developers get 88% of revenue earned
US Labor organization, AFL-CIO writes an open letter to game developers, urging them to unionize for fair treatment at work

Natasha Mathur
18 Feb 2019
3 min read
The American Federation of Labor and Congress of Industrial Organizations (AFL-CIO), the largest labor organization in the United States, published an open letter last week on Kotaku, a video game website and blog. The letter urges video game industry workers to unionize and voice their support for better treatment within the workplace. The letter is from secretary-treasurer Liz Shuler, and this is the first time AFL-CIO has made a public statement about unionizing game developers. In the letter, Shuler talks about the struggles of game developers and the unfair treatment they endure in terms of work conditions, job instability, and inadequate pay. She notes that although U.S. video game sales reached $43 billion in 2018 (3.6 times the film industry's record-breaking box office), a "stunning accomplishment" for game developers, they are still not getting the respect they deserve. "You've built new worlds, designed new challenges and ushered in a new era of entertainment. Now it's time for industry bosses to start treating you with hard-earned dignity and respect", writes Shuler. She mentions that game developers often work outrageous hours in stressful and toxic conditions, unable to ask for better out of fear of losing their jobs. She gives the example of developers at Rockstar Games, who shared their experiences of "crunch time" (when the pressure to succeed is extreme) lasting months and sometimes even years to meet unreal demands from management and deliver a game that made their bosses $725 million in its first three days. "They get rich. They get notoriety. They get to be crowned visionaries and regarded as pioneers. What do you get?", writes Shuler.
According to Shuler, this is a moment for change, and change will come when developers come together as a strong union, using their "collective voice" to ask for a "fair share of wealth" that game developers create every day. She writes that the CEOs and bosses will treat developers right only when they stand together and demand it. "You have the power to demand a stake in your industry and a say in your economic future. Whether we're mainlining caffeine in Santa Monica, clearing tables in Chicago or mining coal in West Virginia, we deserve to collect nothing less than the full value of our work", states Shuler. Public reaction to the news is mostly positive, with some calling for a better and stronger alternative to unions: https://twitter.com/kwertzy/status/1096471380357349376 https://twitter.com/getglitched/status/1096499209719685120 https://twitter.com/moesidegaming/status/1096666233011871744 https://twitter.com/legend500/status/1096571646805188608 https://twitter.com/turnageb/status/1096481116763107328 Check out the complete letter here. Open letter from Mozilla Foundation and other companies to Facebook urging transparency in political ads Google TVCs write an open letter to Google's CEO; demands for equal benefits and treatment The cruelty of algorithms: Heartbreaking open letter criticizes tech companies for showing baby ads after stillbirth
Valve reveals new Index VR Kit with detailed specs, costs up to $999

Fatema Patrawala
02 May 2019
4 min read
Valve introduced its new VR headset kit, the Valve Index, only one month ago, saying preorders would begin May 1st and units would ship in June. Today, Valve is fully detailing the Index headset for the first time and revealing exactly how much it will cost: $999. The price seems relatively high by today's VR headset standards; in comparison, Facebook announced that the Oculus Quest and Oculus Rift S will ship on May 21st for $399. But Valve says it will let you buy parts piecemeal if you need, which is a good deal if you do not wish to buy the whole kit. And if you've already got a Vive or Vive Pro and/or don't need the latest Knuckles controllers, you won't necessarily need to spend that whole $999 to get started. Get the best look yet at the Index headset on the Valve Index website. Like the HTC Vive, which was co-designed with Valve, the Index will still be a tethered experience, with a 5-meter cable that plugs into a gaming PC. It also uses the company's laser-firing Lighthouse base stations to figure out where the headset is at any given time. That's how it lets you walk around a room's worth of space in VR — up to a huge 10 x 10 meter room. Valve's not using cameras for inside-out tracking; the company says the twin stereo RGB cameras here are designed for passthrough (letting you see the real world through the headset) and for the computer vision community. Instead, Valve says the Index's focus is on delivering the highest-fidelity VR experience possible, meaning improved lenses, screens, and audio. It includes a pair of 1440 x 1600-resolution RGB LCDs, rather than the higher-res OLED screens much of the competition is already using. But Valve says its screens run faster — 120Hz, with an experimental 144Hz mode — and are better at combating the "screen door effect" and blur when you move your head, persistence issues that first-gen VR headsets struggled with.
The Valve Index also has an IPD slider to adjust for the distance between your eyes, and lenses that Valve says offer a 20-degree larger field of view than the HTC Vive "for typical users." Most interesting are the built-in headphones shown on the website, which aren't actually headphones at all: they're speakers, designed not to touch your ears, instead firing their sound toward your head. This is similar to how Microsoft's HoloLens visors produce audio, which means that while people around you could theoretically hear what you're doing, there'll be less fiddling with the mechanism to get the audio aligned with your ears. Valve has also provided a 3.5mm headphone jack if you want to plug in your own headphones. Another interesting part of the Valve Index is its controllers, which can be purchased separately for $279. The Valve Index Controllers, formerly known as Knuckles, might be the most intuitive way to get your hands into VR yet. While a strap holds the controller to your hand, 87 sensors track the position of your hands and fingers and even how hard you're pressing down. Theoretically, you could easily reach, grab, and throw virtual objects with such a setup, something that wasn't really possible with the HTC Vive or Oculus Touch controllers. Here's one gameplay example that Valve is showing off: Source - Valve website. Another small improvement is to the company's Lighthouse base stations. Since they now use a single laser and no IR blinker, Valve says they play nicer with other IR devices, which means you can turn your TV on and off without needing to power the base stations down first. According to reports by Polygon, which got an early hands-on with the Valve Index, the Knuckles feel great, the optics are sharp, and it may be the most comfortable VR headset to wear over a pair of glasses yet. Polygon also further explained the $999 price point.
During Valve's demonstration, Polygon reports, a spokesperson said that the Index is the sort of thing likely to appeal to a virtual reality enthusiast who (a) must have the latest thing and (b) enjoys sufficient disposable income to satisfy that desire. It's an interesting contrast with Facebook's strategy for the Rift, which is pushing hard for the price tipping point at which VR suddenly becomes a mass-market thing, like smartphones did a decade ago. Pricing details for the Valve Index kit are on its official page. Top 7 tools for virtual reality game developers Game developers say Virtual Reality is here to stay Facebook releases DeepFocus, an AI-powered rendering system to make virtual reality more real
Unity ML-Agents Toolkit v0.6 gets two updates: improved usability of Brains and workflow for Imitation Learning

Sugandha Lahoti
19 Dec 2018
2 min read
The Unity ML-Agents Toolkit v0.6 is getting two major enhancements, the Unity team announced in a blog post on Monday. The first update turns Brains from MonoBehaviours into ScriptableObjects, improving their usability. The second allows developers to record expert demonstrations and use them for offline training, providing a better user workflow for imitation learning.

Brains are now ScriptableObjects

In previous versions of the ML-Agents Toolkit, Brains were GameObjects attached as children of the Academy GameObject, which made it difficult to reuse Brains across Unity scenes within the same project. In the v0.6 release, Brains are ScriptableObjects, making them manageable as standard Unity assets. This makes it easy to use them across scenes and to create Agent Prefabs with Brains pre-attached. The Unity team has introduced the Learning Brain ScriptableObject, which replaces the previous Internal and External Brains, along with Player and Heuristic Brain ScriptableObjects that replace the Player and Heuristic Brain types, respectively. Developers can no longer change the type of Brain with the Brain Type dropdown; instead, they create a different Brain for Player and Learning from the Assets menu. The BroadcastHub in the Academy component keeps track of which Brains are being trained.

Record expert demonstrations for offline training

The Demonstration Recorder allows users to record the actions and observations of an Agent while playing a game. These recordings can be used to train Agents at a later time via imitation learning, or to analyze the data. In effect, the Demonstration Recorder lets training data be reused across multiple training sessions rather than captured anew every time. Users can add the Demonstration Recorder component to their Agent, check Record, and give the demonstration a name. To train an Agent with the recording, users modify the hyperparameters in the training configuration.
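To make the record-then-train-offline workflow concrete, here is a deliberately tiny sketch of imitation learning from recorded demonstrations. This is plain Python with invented data and names, not the actual ML-Agents API: the real toolkit records demonstrations to files and trains a neural-network policy, whereas this sketch uses the simplest possible "learner" (pick the expert's most frequent action per observation) just to show the shape of the idea.

```python
# Hypothetical sketch of imitation learning from recorded demonstrations.
# Not the ML-Agents API; data and the trivial learner are invented.
from collections import Counter, defaultdict

# 1) "Recorded" expert play: (observation, action) pairs, as a
#    Demonstration Recorder conceptually captures them.
demonstrations = [
    ("enemy_left", "move_right"),
    ("enemy_left", "move_right"),
    ("enemy_right", "move_left"),
    ("no_enemy", "move_forward"),
    ("enemy_left", "jump"),
]

# 2) Offline "training": for each observation, imitate the action the
#    expert chose most often.
def train_imitation_policy(demos):
    by_obs = defaultdict(Counter)
    for obs, action in demos:
        by_obs[obs][action] += 1
    return {obs: counts.most_common(1)[0][0] for obs, counts in by_obs.items()}

policy = train_imitation_policy(demonstrations)
print(policy["enemy_left"])  # "move_right" (seen twice vs. "jump" once)
```

The payoff of the v0.6 workflow is the separation the sketch illustrates: step 1 can happen once, during play, and step 2 can be rerun as many times as needed without an expert present.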
Check out the documentation on Github for more information. Read more about the new enhancements on Unity Blog. Getting started with ML agents in Unity [Tutorial] Unity releases ML-Agents toolkit v0.5 with Gym interface, a new suite of learning environments Unite Berlin 2018 Keynote: Unity partners with Google, launches Ml-Agents ToolKit 0.4, Project MARS and more
Unity releases ML-Agents toolkit v0.5 with Gym interface, a new suite of learning environments

Sugandha Lahoti
12 Sep 2018
2 min read
In its commitment to becoming the go-to platform for artificial intelligence, Unity has released a new version of its ML-Agents Toolkit. ML-Agents Toolkit v0.5 comes with more flexible action specification, a Gym interface so researchers can more easily integrate ML-Agents environments into their training workflows, and a new suite of learning environments replicating some of the continuous-control benchmarks used in deep reinforcement learning. Unity has also released a research paper on ML-Agents, titled "Unity: A General Platform for Intelligent Agents."

Changes in the ML-Agents Toolkit v0.5

Highlighted changes to repository structure: The python folder has been renamed ml-agents; it now contains a Python package called mlagents. The unity-environment folder, containing the Unity project, has been renamed UnitySDK. The protobuf definitions used for communication have been added to a new protobuf-definitions folder. Example curricula and the trainer configuration file have been moved to a new config sub-directory.

New features: A new package, gym-unity, provides a Gym interface that wraps UnityEnvironment. The toolkit can now run multiple concurrent training sessions with the --num-runs=<n> command line option. Meta-Curriculum has been added, supporting curriculum learning in multi-brain environments. Action masking for discrete control makes it possible to mask invalid actions each step, limiting the actions an agent can take.

Fixes & performance improvements: Some activation functions have been replaced with Swish. Visual observations use PNG instead of JPEG to avoid compression losses. Python unit tests have been improved. Multiple training sessions are available on a single GPU. Curriculum lessons are now tracked correctly. Developers can now visualize value estimates from Unity for models trained with PPO, using GetValueEstimate().
It is now possible to specify which camera the Monitor displays to. Console summaries will now be displayed even when running in inference mode from Python. The minimum supported Unity version is now 2017.4. You can read all about the new version of the ML-Agents Toolkit on the Unity Blog. Unity releases ML-Agents v0.3: Imitation Learning, Memory-Enhanced Agents and more. Unity Machine Learning Agents: Transforming Games with Artificial Intelligence. Unite Berlin 2018 Keynote: Unity partners with Google, launches Ml-Agents ToolKit 0.4, Project MARS and more.
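The action-masking feature for discrete control can be sketched conceptually: each step, invalid actions are given zero probability before sampling, so the agent can never select them. The snippet below is an illustrative plain-Python sketch with invented action names and function names, not the toolkit's actual API (in ML-Agents the mask is set on the C# Agent side and respected by the trainer).

```python
# Hypothetical sketch of discrete-action masking: zero out invalid
# actions, renormalize, then sample. Names here are invented.
import random

ACTIONS = ["move_left", "move_right", "jump", "shoot"]

def sample_action(action_probs, mask):
    # mask[i] is True when ACTIONS[i] is valid this step.
    masked = [p if valid else 0.0 for p, valid in zip(action_probs, mask)]
    total = sum(masked)
    if total == 0:
        raise ValueError("no valid actions this step")
    weights = [p / total for p in masked]
    return random.choices(ACTIONS, weights=weights, k=1)[0]

# Example: the agent is out of ammo, so "shoot" is invalid this step.
mask = [True, True, True, False]
probs = [0.1, 0.2, 0.3, 0.4]
action = sample_action(probs, mask)
print(action)  # one of move_left / move_right / jump; never "shoot"
```

Because a masked action's weight is exactly zero, it can never be sampled, which is the guarantee the feature provides: the policy only ever emits actions that are valid in the current state.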