
Tech News - 3D Game Development

56 Articles

Google announces early access of ‘Game Builder’, a platform for building 3D games with zero coding

Bhagyashree R
17 Jun 2019
3 min read
Last week, a team within Area 120, Google’s workshop for experimental products, introduced an experimental prototype called Game Builder. It is a “game building sandbox” that lets you build and play 3D games in just a few minutes. It is currently in early access and is available on Steam.

[Image: how Game Builder makes “building a game feel like playing a game”. Source: Google]

Here are some of the features that Game Builder comes with:

Everything is multiplayer
Game Builder’s always-on multiplayer allows multiple users to build and play games simultaneously. Your friends can even play the game while you are still working on it.

Thousands of 3D models from Google Poly
You can pull thousands of free 3D models (a rocket ship, a synthesizer, an ice cream cone) into your games from Google Poly. You can also “remix” most of the models through Tilt Brush and Google Blocks integration to make them fit your game. Once you find the right model, you can use it in your game instantly.

No code, no compilation required
The platform is designed for all skill levels, from players building their first game to experienced developers who want a faster way to realize their ideas. Game Builder’s card-based visual programming lets you bring a game to life with minimal programming knowledge: you drag and drop cards to answer questions like “How do I move?” You can also create your own cards with Game Builder’s extensive JavaScript API, which lets you script almost everything in the game. Because the code is live, you just save your changes and play, with no compilation step.

Beyond these features, you can create levels with terrain blocks, edit the physics of objects, add lighting and particle effects, and more. Once a game is ready, you can share it on Steam Workshop.

Many people are commending how easy Game Builder makes game building, though others point out that the idea is not new; we have seen similar platforms before, such as GameMaker by YoYo Games. “I just had a play with it. It seems very well thought out. It has a very nice tutorial that introduces all the basic concepts. I am looking forward to trying out the multiplayer aspect, as that seems to be the most compelling thing about it,” a Hacker News user commented.

You can read Google’s official announcement for more details.

Read next:
Google Research Football Environment: A Reinforcement Learning environment for AI agents to master football
Google Walkout organizer, Claire Stapleton resigns after facing retaliation from management
Ian Lance Taylor, Golang team member, adds another perspective to Go being Google’s language
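Google has not published the details of Game Builder’s card or scripting API here, so the following TypeScript sketch only illustrates the general card-based model the announcement describes (cards as reusable behaviors answering questions like “How do I move?”). Every name in it is invented for illustration; none of it is Game Builder’s real JavaScript API.

```typescript
// Illustrative sketch of card-based game logic, loosely modeled on
// Game Builder's "cards answer questions" idea. All names are invented;
// this is not Game Builder's actual JavaScript API.

type Vector = { x: number; y: number };

interface ActorState {
  position: Vector;
  health: number;
}

// A "card" is a reusable behavior attached to a question slot.
interface Card {
  question: "How do I move?" | "What happens when I'm touched?";
  run(state: ActorState, dt: number): ActorState;
}

// A card answering "How do I move?": drift right at a constant speed.
const moveRight: Card = {
  question: "How do I move?",
  run: (state, dt) => ({
    ...state,
    position: { x: state.position.x + 2 * dt, y: state.position.y },
  }),
};

// Apply an actor's cards once per frame.
function tick(state: ActorState, cards: Card[], dt: number): ActorState {
  return cards.reduce((s, card) => card.run(s, dt), state);
}

const start: ActorState = { position: { x: 0, y: 0 }, health: 100 };
const next = tick(start, [moveRight], 0.5);
// next.position.x === 1
```

The appeal of the model is that each card is self-contained, so dragging a card onto an actor composes behavior without any glue code.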

Introducing Minecraft Earth, Minecraft's AR-based game for Android and iOS users

Amrata Joshi
20 May 2019
4 min read
Last week, the team at Minecraft introduced a new AR-based game called ‘Minecraft Earth’, free for Android and iOS users. Its most striking feature is that it builds on the real world with augmented reality, much like Pokémon Go.

Minecraft has around 91 million active players, and Microsoft now wants to take the Pokémon Go concept to the next level by letting Minecraft players create things in the game and share them with friends in the real world. Users can build something in Minecraft on their phones and then drop it into their local park for all their friends to see together at the same location. The game aims to move AR gaming from single-user to multi-user experiences, giving players access to a virtual world shared by everyone.

Read also: Facebook launched new multiplayer AR games in Messenger

Minecraft Earth will be available in beta on iOS and Android this summer. It brings modes like creative, with unlimited blocks and items, and survival, where you lose all your items when you die. Torfi Olafsson, game director of Minecraft Earth, explains, “This is an adaptation, this is not a direct translation of Minecraft. While it’s an adaptation, it’s built on the existing Bedrock engine so it will be very familiar to existing Minecraft players. If you like building Redstone machines, or you’re used to how the water flows, or how sand falls down, it all works.” Olafsson further added, “All of the mobs of animals and creatures in Minecraft are available, too, including a new pig that really loves mud. We have tried to stay very true to the kind of core design pillars of Minecraft, and we’ve worked with the design team in Stockholm to make sure that the spirit of the game is carried through.”

Players have to venture out into the real world to collect things, just as in Pokémon Go. Minecraft Earth has an equivalent of pokéstops called “tappables”, which are randomly placed in the world around the player. They are designed to give players rewards, and players need to collect as many as possible to gather the resources and items for building vast structures in the building mode.

The maps in the game are based on OpenStreetMap, which has allowed Microsoft to place Minecraft adventures into the world. On the Minecraft Earth map, these adventures spawn dynamically and are designed for multiple people to join in. Players sitting side by side can experience the same adventure at the exact same time and place. They can fight monsters and break down structures for resources together, or even stand in front of a friend to physically block them from killing a virtual sheep. Players can also see the tools that fellow players hold in their hands on their phone’s screen, alongside their usernames. Microsoft is also using its Azure Spatial Anchors technology in Minecraft Earth, which uses machine-vision algorithms so that real-world objects can serve as anchors for digital content.

Niantic, the developer of Pokémon Go, recently had to settle a lawsuit with angry homeowners who had pokéstops placed near their houses. What happened with Pokémon Go could be a threat for games like Minecraft Earth too, as bringing augmented reality into private spaces raises many challenges. Saxs Persson, creative director of Minecraft, said, “There are lots of very real challenges around user-generated content. It’s a complicated problem at the scale we’re talking about, but that doesn’t mean we shouldn’t tackle it.”

To know more about Minecraft Earth, check out Minecraft’s page.

Read next:
Game rivals, Microsoft and Sony, form a surprising cloud gaming and AI partnership
Obstacle Tower Environment 2.0: Unity announces Round 2 of its ‘Obstacle Tower Challenge’ to test AI game players
OpenAI Five beats pro Dota 2 players; wins 2-1 against the gamers

Unreal Engine 4.20 released with focus on mobile and immersive (AR/VR/MR) devices

Sugandha Lahoti
20 Jul 2018
4 min read
Following the release of Unreal Engine 4.19 this April, Epic Games has launched Unreal Engine 4.20. This major update focuses on enhancing scalability and creativity, helping developers create more realistic characters and immersive environments for games, film, TV, and VR/AR devices.

Multiple optimizations for mobile game development
Epic Games brought over 100 optimizations created for Fortnite on iOS and Android to Unreal Engine 4.20. Hardware occlusion queries are now supported on high-end iOS and Android devices that support ES 3.1 or Vulkan. Developers can also iterate and debug on Android without having to repackage the UE4 project, and now have unlimited Landscape Material layers on mobile devices.

Mixed Reality Capture
Unreal Engine 4.20 provides new Mixed Reality Capture functionality, which makes it easy to composite real players into a virtual space for mixed-reality applications. It has three components: video input, calibration, and in-game compositing. You can use supported webcams and HDMI capture devices to pull real-world green-screened video into the Unreal Engine from a variety of sources. Setup and calibration are done through a standalone calibration tool that can be reused across Unreal Engine 4 titles.

Niagara visual effects editor
The Niagara visual effects editor is available as an early-access plugin. While it builds on the same particle-manipulation methods as Cascade (UE4’s previous VFX editor), Niagara, unlike Cascade, is fully modular. UE 4.20 adds multiple improvements to Niagara effect design and creation. All of Niagara’s modules have been updated to support behaviors commonly used in building effects for games, and new UI features have been added to the Niagara stack that mimic the options developers have with UProperties in C++. Niagara now supports GPU simulation on DX11, PS4, Xbox One, OpenGL (ES3.1), and Metal platforms; CPU simulation works on PC, PS4, Xbox One, OpenGL (ES3.1), and Metal. Niagara was showcased at GDC 2018; see the presentation “Programmable VFX with Unreal Engine’s Niagara” for a complete overview.

Cinematic Depth of Field
Unreal Engine 4.20 also adds Cinematic Depth of Field, with which developers can achieve cinema-quality camera effects in real time. Cinematic DoF provides a cleaner depth-of-field effect with a cinematic appearance through a procedural bokeh simulation. It also features dynamic resolution stability, supports an alpha channel, and includes settings to scale it down for console projects. For additional information, see the Depth of Field documentation.

Proxy LOD improvements
The Proxy LOD tool is now production-ready. It improves performance by reducing the rendering cost of poly count, draw calls, and material complexity, yielding significant gains when developing for mobile and console platforms. The production-ready version has several enhancements over the experimental version found in UE 4.19. Improved normal control: the user may now supply the hard-edge cutoff angle and the method used to compute the vertex normal. Gap filling: the proxy system automatically discards any inaccessible structures, resulting in fewer total triangles and better use of the limited texture resource.

Magic Leap One early-access support
With Unreal Engine 4.20, game developers can now build for Magic Leap One. Unreal Engine 4 support for Magic Leap One uses built-in UE4 frameworks such as camera control, world meshing, motion controllers, and forward and deferred rendering. For developers with access to hardware, Unreal Engine 4.20 can deploy and run on the device, in addition to supporting zero-iteration workflows through Play In Editor.

Read more: The hype behind Magic Leap’s New Augmented Reality Headsets; Magic Leap’s first AR headset, powered by Nvidia Tegra X2, is coming this Summer

Apple ARKit 2.0 and Google ARCore 1.2 support
Unreal Engine 4.20 adds support for Apple’s ARKit 2.0, with better tracking quality, support for vertical plane detection, face tracking, 2D and 3D image detection, and persistent and shared AR experiences. It also adds support for Google’s ARCore 1.2, including vertical plane detection, Augmented Images, and Cloud Anchors for building collaborative AR experiences.

These are just a select few of the updates to the Unreal Engine. The full list of release notes is available on the Unreal Engine blog.

Read next:
What’s new in Unreal Engine 4.19?
Game Engine Wars: Unity vs Unreal Engine

Meet yuzu – an experimental emulator for the Nintendo Switch

Sugandha Lahoti
17 Jul 2018
3 min read
The makers of Citra, an emulator for the Nintendo 3DS, have released a new emulator called yuzu. This emulator targets the Nintendo Switch, the 7th major video game console from Nintendo.

The journey so far for yuzu
yuzu was initiated as an experimental setup by Citra’s lead developer bunnei after he saw signs that the Switch’s operating system was based on the 3DS’s operating system. yuzu shares its core code with Citra, along with much of the same OS high-level emulation (HLE). Its core emulation and memory management are based on Citra, albeit modified to work with 64-bit addresses. It also has a loader for Switch games and Unicorn integration for CPU emulation.

yuzu relies on reverse engineering to figure out how games work and how the Switch GPU works. The Switch’s GPU is more advanced than the 3DS GPU emulated in Citra and poses multiple challenges to reverse engineer. However, the reverse-engineering process for yuzu is essentially the same as for Citra, and most of this development is done in a trial-and-error manner.

OS emulation
The Switch’s OS is based on the Nintendo 3DS’s OS, so the developers reused a large part of Citra’s OS HLE code for yuzu. The loader and file-system service were reused from Citra and modified to support Switch game dump files. The kernel OS threading, scheduling, and synchronization fixes for yuzu were also ported from Citra’s OS implementation, as was the save-data functionality that lets games read and write files to the save-data directory.

Switchbrew helped them create libnx, a userland library for writing homebrew apps for the Nintendo Switch. (Homebrew is a popular term for applications created and executed on a video game console by hackers, programmers, developers, and consumers.) The Switch IPC (inter-process communication) system is much more robust and complicated than the 3DS’s: it has different command modes, a typical IPC request response, and Domains to efficiently conduct multiple service calls. yuzu uses the Nvidia services to configure the video driver to get graphics output. However, Nintendo repurposed the Android graphics stack for rendering on the Switch, so the yuzu developers had to implement parts of it even to get homebrew applications to display graphics.

The next steps
Being at a nascent stage, yuzu still has a long way to go. The developers still have to add HID (user input) support, covering all 9 controllers, rumble, LEDs, layouts, and so on. Audio HLE is currently in progress, but audio playback is not yet implemented. Proper audio playback would be a major breakthrough, as many complicated games hang or deadlock because of this issue. They are also working on minor fixes to help games like Super Mario Odyssey, 1-2-Switch, and The Binding of Isaac boot further.

Be sure to read the entire progress report on the yuzu blog.

Read next:
AI for game developers: 7 ways AI can take your game to the next level
AI for Unity game developers: How to emulate real-world senses in your NPC agent behavior
Unity 2018.2: Unity release for this year 2nd time in a row!
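Much of the article leans on the idea of high-level emulation (HLE): instead of executing the console’s own OS code instruction by instruction, the emulator intercepts a game’s system-service calls and re-implements them on the host. The TypeScript sketch below only illustrates that dispatch pattern in the abstract; all service names and return values are invented for illustration and are not yuzu’s actual code.

```typescript
// Toy high-level emulation (HLE) dispatch: the emulator recognizes a
// guest service call by name and runs a host-side reimplementation.
// All names and values are invented; this is not yuzu's real code.

type ServiceHandler = (args: number[]) => number;

// Host-side state standing in for the guest's save-data files.
const hostFiles = new Map<number, string>([[0, "save.dat"]]);

// Host-side reimplementations of (hypothetical) guest OS services.
const services: Record<string, ServiceHandler> = {
  // "open the save-data file": returns a file handle
  "fs:OpenSaveData": () => 0,
  // "get the size of the file behind a handle", -1 if unknown
  "fs:GetSize": ([handle]) => (hostFiles.has(handle) ? 1024 : -1),
};

// The dispatcher invoked when the guest issues a service request.
function hleCall(name: string, args: number[] = []): number {
  const handler = services[name];
  if (!handler) throw new Error(`unimplemented service: ${name}`);
  return handler(args);
}

const handle = hleCall("fs:OpenSaveData");
const size = hleCall("fs:GetSize", [handle]);
```

The win is that each service only needs to behave correctly from the game’s point of view; the emulator never has to run the console’s real OS code for it.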

Unreal Engine 4.22 update: support added for Microsoft’s DirectX Raytracing (DXR)

Melisha Dsouza
15 Feb 2019
3 min read
On 12th February, Epic Games released a preview build of Unreal Engine 4.22, and a major upgrade among numerous other features and fixes is support for real-time ray tracing and path tracing. The new build extends preliminary support for Microsoft’s DirectX Raytracing (DXR) extensions to the DirectX 12 API, so developers can now try their hand at ray-traced games built in Unreal Engine 4.

Very few games currently support ray tracing. Only Battlefield V (ray-traced reflections) and Metro Exodus (ray-traced global illumination) feature ray-tracing effects, developed in the proprietary Frostbite 3 and 4A game engines.

Fun fact: ray tracing is a much more advanced and lifelike way of rendering light and shadows in a scene. Movies and TV shows use it to create and blend in amazing CG work with real-life scenes, leading to more lifelike, interactive, and immersive game worlds with more realistic lighting, shadows, and materials.

The patch notes released by the team state that low-level ray tracing support has been added:

Added ray tracing low-level support: implemented a low-level layer on top of UE DirectX 12 that provides support for DXR and allows creating and using ray tracing shaders (ray generation shaders, hit shaders, etc.) to add ray tracing effects.
Added high-level ray tracing features: rect area lights, soft shadows, reflections, reflected shadows, ambient occlusion, RTGI (ray-traced global illumination), translucency, clearcoat, IBL, sky.
Geometry types: triangle meshes, static and skeletal (morph targets and skin cache), Niagara particles support.
Texture LOD.
Denoiser for shadows, reflections, and AO.
Path Tracer: an unbiased, full-GI path tracer for making ground-truth reference renders inside UE4.

According to HardOCP, the feature isn’t technically tied to Nvidia RTX, but since Turing cards are currently the only ones with driver support for DirectX Raytracing, developers need an RTX 2000-series GPU to test Unreal’s ray tracing. There has been much debate about NVIDIA’s RTX in the past: while the concept sounded interesting at the start, very few engines adopted it, simply because previous-generation processors cannot support all the features of NVIDIA’s RTX. Now, with DXR in the picture, it will be interesting to see the outcome of games developed using ray tracing.

Head over to Unreal Engine’s official post to know more about this news.

Read next:
Implementing an AI in Unreal Engine 4 with AI Perception components [Tutorial]
Unreal Engine 4.20 released with focus on mobile and immersive (AR/VR/MR) devices
Game Engine Wars: Unity vs Unreal Engine
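For readers new to the technique the release is built around: at its core, ray tracing shoots rays from the camera into the scene and tests them against geometry. A minimal, engine-agnostic TypeScript sketch of that core operation, ray-sphere intersection, is below; it is a textbook illustration, not Unreal Engine or DXR API code.

```typescript
// Minimal ray-sphere intersection: the core primitive of a ray tracer.
// Engine-agnostic textbook illustration; not Unreal Engine or DXR code.

type Vec3 = [number, number, number];

const sub = (a: Vec3, b: Vec3): Vec3 => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const dot = (a: Vec3, b: Vec3): number => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// Returns the distance along the ray to the nearest hit, or null on a miss.
function raySphere(origin: Vec3, dir: Vec3, center: Vec3, radius: number): number | null {
  const oc = sub(origin, center);
  const a = dot(dir, dir);          // 1 if dir is normalized
  const b = 2 * dot(oc, dir);
  const c = dot(oc, oc) - radius * radius;
  const disc = b * b - 4 * a * c;   // discriminant of the quadratic
  if (disc < 0) return null;        // ray misses the sphere entirely
  const t = (-b - Math.sqrt(disc)) / (2 * a);
  return t >= 0 ? t : null;         // nearest hit in front of the origin
}

// A ray shot down +z from the origin hits a unit sphere 5 units away.
const hit = raySphere([0, 0, 0], [0, 0, 1], [0, 0, 5], 1);  // → 4
const miss = raySphere([0, 0, 0], [0, 1, 0], [0, 0, 5], 1); // → null
```

A renderer repeats this test per pixel against the whole scene, then spawns secondary rays for shadows and reflections, which is why hardware acceleration (DXR on Turing GPUs) matters so much.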

Microsoft announces Game stack with Xbox Live integration to Android and iOS

Natasha Mathur
15 Mar 2019
3 min read
Microsoft has good news for all the game developers out there. Yesterday it launched a new initiative called Microsoft Game Stack, which brings together different Microsoft tools and services into a single robust ecosystem to ‘empower game developers’. Whether you’re a rookie indie developer or an AAA studio, this developer-focused platform aims to make the game development process far easier.

The main goal of Game Stack is to help developers find the different tools and services required for game development in one place. These range from Azure, PlayFab, DirectX, and Visual Studio to Xbox Live, App Center, and Havok.

The cloud plays a major role in Game Stack, and Azure fills that requirement. Azure, globally available in 54 regions, will help scale Project xCloud (a service that streams games to PCs, consoles, and mobile devices) to provide an uninterrupted gaming experience for players worldwide. Companies like Rare, Ubisoft, and Wizards of the Coast are already hosting multiplayer game servers and storing their player data on Azure, which is also capable of analyzing game telemetry, protecting games from DDoS attacks, and training AI. Moreover, Microsoft Game Stack is device-agnostic, which makes it really convenient for gamers.

Another major component of Game Stack is PlayFab, a backend service for building and operating new games. PlayFab brings game development services, real-time analytics, and LiveOps capabilities to Game Stack. PlayFab is also device-agnostic: it supports iOS, Android, PC, web, Xbox, Sony PlayStation, and Nintendo Switch, as well as all the major game engines, such as Unity and Unreal. Microsoft has also released previews of five new PlayFab services. Of these, PlayFab Matchmaking is in public preview, while the other four, PlayFab Party, PlayFab Insights, PlayFab PubSub, and PlayFab User-Generated Content, are in private preview.

Game Stack also includes Xbox Live, one of the most engaged and interactive gaming communities in the world, which will provide identity and community services within Game Stack. Microsoft has expanded the cross-platform capabilities of Xbox Live under Game Stack with a new SDK for iOS and Android devices, letting mobile developers connect with some of the most highly engaged and passionate gamers on the planet. Other benefits of the Xbox Live SDK include more focus on building games, leveraging Microsoft’s trusted identity network to offer support for log-in, privacy, online safety, and child accounts, and features like gamerscore and “hero” stats that help keep gamers engaged.

Components such as Visual Studio, Visual Studio Code, Mixer, DirectX, Azure App Center, and Havok are all part of Game Stack as well.

For more information, check out the official Microsoft Game Stack blog post.

Read next:
Microsoft open sources the Windows Calculator code on GitHub
Microsoft open sources ‘Accessibility Insights for Web’, a Chrome extension to help web developers fix their accessibility issues
Microsoft researchers introduce a new climate forecasting model and a public dataset to train these models

Unite Berlin 2018 Keynote: Unity partners with Google, launches Ml-Agents ToolKit 0.4, Project MARS and more

Sugandha Lahoti
20 Jun 2018
5 min read
Unite Berlin 2018, Unity’s annual developer conference, kicked off on June 19, 2018. This three-day event is filled with new announcements, sessions, and workshops from the creators of Unity, and is a place to develop, network, and participate with artists, developers, filmmakers, researchers, storytellers, and other creators. Day 1 opened with the keynote, presented by John Riccitiello, CEO of Unity Technologies. It featured previews of upcoming Unity technology, most prominently Unity’s alliance with Google Cloud to help developers build connected games. Let’s take a look at what was showcased.

Connected games with Unity and Google Cloud
Unity and Google Cloud have collaborated to help developers create real-time multiplayer games. They are building a suite of managed services and tools to help developers build, test, and run connected experiences while offloading the hard work of quickly scaling game servers to Google Cloud. Games can be easily scaled to meet the needs of the players, and developers can harness the massive power of Google Cloud without having to be cloud experts. Here’s what Google Cloud with Unity has in store:
Game-server hosting: streamlined resources to develop and scale hosted multiplayer games.
Sample FPS: a production-quality sample project of a real-time multiplayer game.
New ECS networking layer: fast, flexible networking code that delivers performant multiplayer by default.

Unity ML-Agents Toolkit v0.4
A new version of the Unity ML-Agents Toolkit was also announced at Unite Berlin. The v0.4 toolkit hosts multiple updates requested by the Unity community. Game developers now have the option to train environments directly from the Unity editor rather than as built executables: simply launch the learn.py script, then press the “play” button from within the editor to perform training. The team also launched two new challenging environments, Walker and Pyramids. Walker is a physics-based humanoid ragdoll, and Pyramids is a complex sparse-reward environment. There are also algorithmic improvements in reinforcement learning: agents can now learn to solve tasks that were previously learned only with great difficulty. Unity is also partnering with Udacity to launch a Deep Reinforcement Learning Nanodegree to help students and professionals gain a deeper understanding of reinforcement learning.

Augmented reality with Project MARS
Unity also announced Project MARS, a Mixed and Augmented Reality studio that will be provided as a Unity extension. The studio will allow game developers to build AR and MR applications that intelligently interact with any real-world environment, with little to no custom coding. MARS will include abstract layers for object recognition, location, and map data. It will ship with sample templates with simulated rooms for testing against different environments inside the editor. AR-specific gizmos will make it easy to define spatial conditions like plane size, elevation, and proximity without code or precise measurements, and it will include elements ranging from face masks to avatars to entire rooms of digital art. Project MARS will come to Unity as an experimental package later this year.

Unity also unveiled a Facial AR Remote component. Powered by augmented reality, this component captures performances to drive animated characters, allowing filmmakers and CGI developers to shoot CG content with real movement, much as they would with live action.

Kinematica: a machine-learning-powered animation system
Unity also showcased its AI research by announcing Kinematica, an all-new ML-powered animation system. Traditional animation systems generally require animators to explicitly define transitions; Kinematica has no superimposed structure, such as graphs or blend trees. It generates smooth transitions and movements by applying machine learning to any data source, so game developers and animators no longer need to manually map out animation graphs. Kinematica decides in real time how to combine data clips from a single library into a sequence that matches the controller input, the environment content, and the gameplay requests. As with Project MARS, Kinematica will be available later this year as an experimental package.

New Prefab workflows
The Prefab system has been revamped with multiple improvements, and the improved Prefab workflow is now available as a preview build. New additions include Prefab Mode, prefab variants, and nested prefabs. Prefab Mode allows faster, more efficient, and safer editing of prefabs in an isolated mode, without adding them to the actual scene. Developers can edit a model prefab and have the changes propagate to all of its prefab variants. With nested prefabs, teams can work on different parts of a prefab and then come together for the final asset.

Predictive Personalized Placements
Personalized Placements aim to bring the best of both worlds to players and the commercial business. With this new feature, game developers can create tailor-made game experiences for each player. It runs on an engine powered by predictive analytics, which determines what to show to each player based on what will drive the highest engagement and lifetime value: an ad, an IAP promotion, a notification of a new feature, or a cross-promotion. The algorithm will only get better with time.

These were only a select few of the announcements presented in the Unite Berlin keynote. You can watch the full video on YouTube. Details on other sessions, seminars, and activities are available on the Unite website.

Read next:
GitHub for Unity 1.0 is here with Git LFS and file locking support
Unity announces a new automotive division and two-day Unity AutoTech Summit
Put your game face on! Unity 2018.1 is now available
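Unity has not published Kinematica’s internals, but the keynote’s description, matching clips from a library against controller input, is the core idea behind motion-matching systems in general. The TypeScript sketch below illustrates only that generic idea with invented data; it is not Kinematica’s actual API or algorithm.

```typescript
// Toy motion-matching selector: pick the library clip whose recorded
// velocity best matches the controller's requested velocity. Generic
// motion-matching illustration; not Kinematica's actual API.

interface Clip {
  name: string;
  velocity: number; // forward speed (m/s) captured in the clip
}

function bestClip(library: Clip[], requestedVelocity: number): Clip {
  return library.reduce((best, clip) =>
    Math.abs(clip.velocity - requestedVelocity) <
    Math.abs(best.velocity - requestedVelocity)
      ? clip
      : best
  );
}

const library: Clip[] = [
  { name: "idle", velocity: 0 },
  { name: "walk", velocity: 1.5 },
  { name: "run", velocity: 4 },
];

// A stick pushed halfway forward requests ~2 m/s: "walk" is closest.
const chosen = bestClip(library, 2);
```

Real systems match on a much richer feature vector (pose, trajectory, contacts) rather than a single speed, but the selection-by-distance principle is the same, and it is why no explicit transition graph is needed.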

Put your game face on! Unity 2018.1 is now available

Sugandha Lahoti
07 May 2018
4 min read
Unity Technologies has announced the release of their latest platform update Unity 2018.1 giving artists, developers and game engineers the power to express their talents and collaborate more efficiently to build games. Unity 2018.1 also marks the start of a new release cycle. Since 2017, Unity has adopted a new release plan where they come up with a new version every quarter and Unity 2018.1 marks the first version of the 2018 series. According to Brett Bibby, VP of Engineering, Unity Technologies, “With Unity 2018.1 we are introducing one of the largest upgrades in the history of our company, and it’s centered around two major concepts - next-level rendering and performance by default,” This release features two new upgrades: the Scriptable Render Pipelines and the Entity Component System. Together they make it easier for creators to make richer experiences utilizing modern hardware to deliver beautiful graphics. Next-level rendering with Scriptable Render Pipeline (SRP) Scriptable Render Pipeline (SRP) is available in the preview of Unity 2018.1. With SRP, developers and technical artists can now work directly with hardware and GPUs without having to go through millions of lines of C++ engine code. SRP makes it easy to customize the rendering pipeline via C# code and material shaders.   Unity 2018.1 also introduces two render pipelines. The High-Definition Render Pipeline (HD RP) is for developers with AAA aspirations. The Lightweight Render Pipeline (LW RP) is for those looking for a combination of graphics and speed. It optimizes the battery life for mobile devices and other similar platforms. Performance by default with the C# Job System &  Entity Component System (ECS) The C# Job system enables developers to write very fast, parallelized code in C# to take full advantage of multicore processors. It also provides protection from the pitfalls of multi-threading, such as race conditions and deadlocks. 
The runtime system is now combined with a new programming model, the Entity Component System. This new runtime system enables developers to use multicore processors without worrying about the programming. They can use this power to add more effects and complexity to games or add AI to make their creations richer and more immersive. It uses a data-oriented design instead of an object-oriented approach which makes it easier to reuse the code and easier for others to understand and work on it as well. Level design and shaders Unity 2018.1 reduces the time and effort required by artists, designers, and developers by allowing them to create levels, cinematic content, and gameplay sequences without coding. For this, new tools like ProBuilder/Polybrush and the new visual Shader Graph offer intuitive ways to design levels and create shaders without programming skills. ProBuilder is a unique hybrid of 3D-modeling and level-design tools optimized for building simple geometry, but capable of detailed editing and UV unwrapping as needed. With Polybrush developers can blend textures and colors, sculpt meshes and scatter objects directly in the Unity editor. Shader Graph can build shaders visually using a designer tool — without writing a single line of code. They offer easy drag-and-drop usability to create and connect nodes in a graph network. Unity Package Manager UI Unity 2018.1 builds on the package manager introduced in Unity 2017.2. It has a newly released Package Manager User Interface, the Hub, and Project Templates, to help start new projects faster and more efficiently. The Unity Package Manager UI  improves the following aspects of the project management workflow: Quick access to newly released features Get the latest fixes, instantly Access to Preview features Easily share lightweight projects Unity 2018.1 offers support for over 25+ platforms. This includes Magic Leap One, Oculus Go, ARCore 1.1, Android ARM64, Daydream Standalone and more. 
You can refer to the release notes for the full list of new features, improvements, and fixes. Unity will showcase its latest innovations at Unite Berlin, scheduled for June 19-21, 2018.

- Unity plugins for augmented reality application development
- Game Engine Wars: Unity vs Unreal Engine
- Unity releases ML-Agents v0.3: Imitation Learning, Memory-Enhanced Agents and more

Epic Games announces: Epic MegaGrants, RTX-powered Ray tracing demo, and free online services for game developers

Natasha Mathur
22 Mar 2019
4 min read
Epic Games, an American video game and software development company, made a series of announcements earlier this week:

- Epic Games CEO Tim Sweeney will offer $100 million in grants to game developers
- A stunning RTX-powered ray-tracing demo named Troll
- The launch of Epic's free online services for game developers

Epic MegaGrants: $100 million in funds for game developers

Tim Sweeney, CEO of Epic Games Inc, announced earlier this week that he will offer $100 million in grants to game developers to boost the growth of the gaming industry. Sweeney made the announcement during a presentation on Wednesday at the Game Developers Conference (GDC), the world's largest professional game-industry event, which ended yesterday in San Francisco.

Epic Games had previously created a $5 million grant fund, disbursed over the last three years. It is now building a new fund called Epic MegaGrants. These are "no-strings-attached" grants: there are no contracts requiring game developers to do anything for Epic. Developers simply apply for the grants, create an innovative project, and, if Epic's judges find it worthy, receive the funds.

"There are no commercial hooks back to Epic. You don't have to commit to any deliverables. This is our way of sharing Fortnite's unbelievable success with as many developers as we can," said Sweeney.

Troll: a ray-tracing Unreal Engine 4 demo

Another eye-grabbing moment at GDC this year was a visually stunning ray-tracing demo by Goodbye Kansas and Deep Forest Films called "Troll". Troll was rendered in real time using Unreal Engine 4.22 ray tracing and camera effects, powered by a single NVIDIA GeForce RTX 2080 Ti graphics card. Troll is visually inspired by Swedish painter and illustrator John Bauer, whose illustrations are famous from the Swedish folklore and fairy-tale anthology 'Among Gnomes and Trolls'.
https://www.youtube.com/watch?v=Qjt_MqEOcGM

Troll

"Ray tracing is more than just reflections - it's about all the subtle lighting interactions needed to create a natural, beautiful image. Ray tracing adds these subtle lighting effects throughout the scene, making everything look more real and natural," said Nick Penwarden, Director of Engineering for Unreal Engine at Epic Games.

The NVIDIA team states in a blog post that Epic Games has been working to integrate RTX-accelerated ray tracing into its popular Unreal Engine 4. In fact, Unreal Engine 4.22 will support the new Microsoft DXR API for real-time ray tracing.

Epic's free online services launch for game developers

Epic Games also announced the launch of free tools and services, part of Epic Online Services, which was announced in December 2018. The SDK is available via the new developer portal for immediate download and use, and currently supports Windows, Mac, and Linux. As part of the release, the SDK provides two free services: game analytics and player ticketing.

Game analytics helps developers understand player behavior, covering daily active users (DAU), monthly active users (MAU), retention, new-player counts, game-launch counts, online user counts, and more. The ticketing system connects players directly with developers and allows them to report bugs and other problems. These two services will continue to evolve, along with the rest of Epic Online Services (EOS), to offer the infrastructure and tools developers need to launch, operate, and scale high-quality online games. Epic Games will also offer additional free services throughout 2019, including player data storage, player reports, leaderboards and stats, player identity, player inventory, and matchmaking.
"We are committed to developing EOS with features that can be used with any engine, any store and that can support any major platform...these services will allow developers to deliver cross-platform gameplay experiences that enable players to enjoy games no matter what platform they play on," states the Epic Games team.

- Fortnite server suffered a minor outage, Epic Games was quick to address the issue
- Epic games CEO calls Google "irresponsible" for disclosing the security flaw in Fortnite Android installer
- Fortnite creator Epic games launch Epic games store where developers get 88% of revenue earned

US Labor organization, AFL-CIO writes an open letter to game developers, urging them to unionize for fair treatment at work

Natasha Mathur
18 Feb 2019
3 min read
The American Federation of Labor and Congress of Industrial Organizations (AFL-CIO), the largest labor organization in the United States, published an open letter on Kotaku, a video game website and blog, last week. The letter urges video game industry workers to unionize and voice their support for better treatment in the workplace.

The letter is from secretary-treasurer Liz Shuler, and this is the first time the AFL-CIO has made a public statement about unionizing game developers. In it, Shuler describes the struggles of game developers and the unfair treatment they endure in terms of working conditions, job instability, and inadequate pay.

Shuler notes that although U.S. video game sales reached $43 billion in 2018 (3.6 times the film industry's record-breaking box office), a "stunning accomplishment" for game developers, they are still not getting the respect they deserve.

"You've built new worlds, designed new challenges and ushered in a new era of entertainment. Now it's time for industry bosses to start treating you with hard-earned dignity and respect," writes Shuler.

She mentions that game developers often work outrageously long hours in stressful, toxic conditions, unable to ask for better out of fear of losing their jobs. She gives the example of developers at Rockstar Games, who shared their experiences of "crunch time" (when the pressure to succeed is extreme) lasting months and sometimes even years to meet unrealistic demands from management and deliver a game that earned their bosses $725 million in its first three days.

"They get rich. They get notoriety. They get to be crowned visionaries and regarded as pioneers. What do you get?", writes Shuler.
According to Shuler, this is a moment for change, and change will come when developers stand together as a strong union, using their "collective voice" to demand a "fair share of wealth" that game developers create every day. She writes that CEOs and bosses will treat developers right only when workers stand together and demand it.

"You have the power to demand a stake in your industry and a say in your economic future. Whether we're mainlining caffeine in Santa Monica, clearing tables in Chicago or mining coal in West Virginia, we deserve to collect nothing less than the full value of our work," states Shuler.

Public reaction to the news is mostly positive, with some people calling for a better and stronger alternative to unions:

https://twitter.com/kwertzy/status/1096471380357349376
https://twitter.com/getglitched/status/1096499209719685120
https://twitter.com/moesidegaming/status/1096666233011871744
https://twitter.com/legend500/status/1096571646805188608
https://twitter.com/turnageb/status/1096481116763107328

Check out the complete letter here.

- Open letter from Mozilla Foundation and other companies to Facebook urging transparency in political ads
- Google TVCs write an open letter to Google's CEO; demands for equal benefits and treatment
- The cruelty of algorithms: Heartbreaking open letter criticizes tech companies for showing baby ads after stillbirth
Researcher shares a Wolfenstein real-time ray tracing demo in WebGL1

Bhagyashree R
18 Mar 2019
3 min read
Last week, Reinder Nijhoff, a computer vision researcher, created a project that does real-time ray tracing in WebGL1 using NVIDIA's RTX graphics card. The demo was inspired by Metro's real-time global illumination.

https://twitter.com/ReinderNijhoff/status/1106193109376008193

The demo uses a hybrid rendering engine built with WebGL1. It renders all the polygons in a frame with traditional rasterization, then combines the result with ray-traced shadows, diffuse global illumination, and reflections.

Credits: Reinder Nijhoff

What is the ray tracing technique?

In computer graphics, ray tracing is a technique for rendering 3D scenes with very complex light interactions. An algorithm traces the path of light and simulates how the light interacts with virtual objects. Light interacts with virtual objects in three ways:

- It can be reflected from one object to another, causing reflections.
- It can be blocked by objects, causing shadows.
- It can pass through transparent or semi-transparent objects, causing refractions.

All these interactions are combined to determine the final color of each pixel.

Ray tracing has traditionally been used for offline rendering because of its ability to accurately model the physical behavior of light in the real world. Due to its computationally intensive nature, it was rarely the first choice for real-time rendering. This changed with the introduction of NVIDIA's RTX graphics cards, which add custom acceleration hardware and make real-time ray tracing relatively straightforward.

What was this demo about?

The project's prototype was based on a forward renderer that first draws all the geometry in the scene. Next, the shader used to rasterize the geometry (converting the scene's geometry into pixels) calculates the direct lighting.
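The core loop described above, casting a ray per pixel and testing it against scene geometry to decide what that pixel sees, reduces to intersection tests. Below is a CPU-side Python sketch of the classic ray-sphere test, purely illustrative: the demo itself runs in WebGL/GLSL, and the scene here is hypothetical.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive ray-sphere intersection distance,
    or None if the ray misses (solves the quadratic |o + t*d - c|^2 = r^2)."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None            # ray misses: pixel gets the background
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# One sphere straight ahead of the camera; a ray through the image center hits it.
print(hit_sphere((0, 0, 0), (0, 0, -1), (0, 0, -3), 1.0))  # -> 2.0

# A ray pointing away misses entirely.
print(hit_sphere((0, 0, 0), (0, 1, 0), (0, 0, -3), 1.0))   # -> None
```

The same test doubles as a shadow test: cast a second ray from the hit point toward the light, and if anything intersects it first, the point is in shadow.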
Additionally, the shader casts random rays from the surface of the rendered geometry to collect, with a ray tracer, the indirect light reflected by non-shiny surfaces. Nijhoff started with a very simple scene for the prototype, a single light and only a few spheres and cubes, which kept the ray tracing code straightforward. Once the prototype was complete, he took it to the next level by adding more geometry and many lights to the scene.

Despite the complexity of the environment, Nijhoff wanted to ray trace the scene in real time. Generally, a bounding volume hierarchy (BVH) is used as an acceleration structure to speed up ray tracing. However, with WebGL1 shaders it is difficult to pre-calculate and use a BVH. This is why Nijhoff decided to use a Wolfenstein 3D level for this demo: its simple grid-based layout can be ray traced without such a structure.

To know more in detail, check out the original post shared by Reinder Nijhoff.

- Unity switches to WebAssembly as the output format for the Unity WebGL build target
- NVIDIA shows off GeForce RTX, real-time raytracing GPUs, as the holy grail of computer graphics to gamers
- Introducing SCRIPT-8, an 8-bit JavaScript-based fantasy computer to make retro-looking games

What’s got game developers excited about Unity 2018.2?

Amarabha Banerjee
26 Jun 2018
3 min read
The undisputed leader among game engines over the last few years has been Unity. It brings .NET professionals and enthusiasts from across the globe under the gaming umbrella with its C# game-scripting feature, and it boasts a very active community and an even busier release schedule. Unity follows semantic versioning: version numbers, and the way they change, convey what has been modified from one version to the next.

Unity has just released the 2018.2 beta. Here are some exciting features to look forward to in Unity 2018.2.

Texture mipmap streaming: If you are a game developer, saving GPU memory is probably one of your top priorities. Unity 2018.2 gives you control over which mipmap levels are loaded. Previous versions loaded all mipmaps at the same time, putting a heavy load on GPU memory; streaming reduces that load at the cost of slightly more CPU work.

Improved Package Manager: Unity 2018.2 comes with an improved package manager: a better UI font, package status labels, the ability to dock the window, and easy access to both documentation and the list of changes.

Improvements to the particle system: The 2018.2 beta improves the particle system and adds new scripting APIs for baking the geometry of a particle system into a mesh. Unity now allows up to eight texture coordinates on meshes, passed to shaders, and particle systems now convert their colors into linear space, when appropriate, before uploading them to the GPU.

Camera improvements: Unity has made major improvements to the camera, the way it functions, and the way it renders objects in the game, to portray them like real-life objects.
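Why does skipping the top mip levels save so much memory? Each mip level has a quarter of the texels of the one above it, so the largest levels dominate the total. A quick back-of-the-envelope calculation (hypothetical 4-bytes-per-texel RGBA texture; illustrative arithmetic, not Unity API code):

```python
def mip_chain_bytes(size, bytes_per_texel=4, skip_top_levels=0):
    """Total memory for a square texture's mip chain, optionally
    skipping the largest `skip_top_levels` levels (as streaming can).
    A full chain costs only ~1.33x the base level, but that base
    level is exactly what streaming lets you avoid loading."""
    total, level = 0, 0
    while size >= 1:
        if level >= skip_top_levels:
            total += size * size * bytes_per_texel
        size //= 2
        level += 1
    return total

full = mip_chain_bytes(2048)                          # all levels resident
streamed = mip_chain_bytes(2048, skip_top_levels=2)   # drop the two largest
print(full, streamed)  # -> 22369620 1398100
```

Dropping just the two largest levels of a 2048x2048 texture cuts its resident memory to roughly 1/16 of the full chain, which is why per-mipmap control matters.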
Animation Jobs C# API: Unity 2018.2 improves AnimationPlayables by allowing users to write their own C# Playables that interact directly with animation data. This allows integration of user-made IK solvers, procedural animation, or even custom mixers into the current animation system.

These features, along with other improvements and bug fixes, should help developers create better and smarter games with the latest Unity 2018.2. To know more about the Unity 2018.2 features, visit the official Unity blog.

- How to use arrays, lists, and dictionaries in Unity for 3D game development
- Build an ARCore app with Unity from scratch
- Implementing lighting & camera effects in Unity 2018

Adobe Acquires Allegorithmic, a popular 3D editing and authoring company

Amrata Joshi
24 Jan 2019
3 min read
Yesterday, Adobe announced that it has acquired Allegorithmic, the creator of Substance and other 3D editing and authoring tools for gaming, entertainment, and post-production. Allegorithmic's customer base is diverse, ranging across the gaming, film and television, e-commerce, retail, automotive, architecture, design, and advertising industries. Popular Allegorithmic users include Electronic Arts, Ubisoft, Ikea, BMW, Louis Vuitton, and Foster + Partners, among others. Allegorithmic's Substance tools are used in games such as Call of Duty, Assassin's Creed, and Forza, and for visual effects and animation in popular movies like Blade Runner 2049, Pacific Rim Uprising, and Tomb Raider.

Adobe will help accelerate Allegorithmic's product roadmap and go-to-market strategy, and further extend its reach among enterprise, SMB, and individual customers. Sebastien Deguy, CEO and founder of Allegorithmic, will take a leadership role as vice president of Adobe's broader 3D and immersive design efforts. With this acquisition, Adobe also wants to make Creative Cloud (a set of Adobe applications and services for graphic design, video editing, and more) the home of 3D design tools.

How will Creative Cloud benefit from Allegorithmic?

Adobe and Allegorithmic previously worked together three years ago; as a result of that work, Adobe introduced a standard PBR material for Project Aero, Adobe Dimension, Adobe Capture, and every 3D element in Adobe Stock. Now Adobe will empower video game creators, VFX artists, designers, and marketers by combining Allegorithmic's Substance 3D design tools with Creative Cloud's imaging, video, and motion-graphics tools. Creative Cloud can benefit from Allegorithmic's tools in gaming, entertainment, and retail, and even for designing the textures and materials that give 3D content detail and realism.
Creative Cloud tools such as Photoshop, Premiere Pro, Dimension, and After Effects are already of great significance to content creators; adding Allegorithmic's Substance tools to Creative Cloud will make it even more powerful.

In a blog post, Scott Belsky, chief product officer and executive vice president at Creative Cloud, said, "Our goal with Creative Cloud is to provide creators with all the tools they need for whatever story they choose to tell. Increasingly, stories are being told with 3D content. That's why I'm excited to announce that today Adobe has acquired Allegorithmic, the industry standard in tools for 3D material and texture creation for gaming and entertainment."

Sebastien Deguy said, "Allegorithmic and Adobe share the same passion for bringing inspiring technologies to creators. We are excited to join the team, bring together the strength of Allegorithmic's industry-leading tools with the Creative Cloud platform and transform the way businesses create powerful, interactive content and experiences."

In the future, Adobe may focus on making Allegorithmic tools available via subscription. Some users are concerned about the termination of the perpetual license and are unhappy about this news. It will be interesting to see the next set of updates from the team at Adobe.

https://twitter.com/sudokuloco/status/1088101391871107073
https://twitter.com/2017_nonsense/status/1088181496710479872

- Adobe set to acquire Marketo putting Adobe Experience Cloud at the heart of all marketing
- Adobe glides into Augmented Reality with Adobe Aero
- Adobe to spot fake images using Artificial Intelligence
Godot 3.1 released with improved C# support, OpenGL ES 2.0 renderer and much more!

Savia Lobo
15 Mar 2019
4 min read
On Wednesday, 13 March, the Godot developers announced the release of a new version of the open source, cross-platform 2D and 3D game engine: Godot 3.1. This new version includes much-requested improvements to the previous major release, Godot 3.0.

Improved features in Godot 3.1

OpenGL ES 2.0 renderer

Rendering is done entirely in sRGB color space (the GLES3 renderer uses linear color space). This is much more efficient and compatible, but it means HDR is not supported. Some advanced PBR features, such as subsurface scattering, are not supported, and unsupported features are hidden when editing materials. Some shader features will not work and will throw an error when used. Some post-processing effects are also absent, and unsupported features are hidden when editing environments. GPU-based particles will not work, as there is no transform-feedback support; users can use the new CPUParticles node instead.

Optional typing in GDScript

This has been one of the most requested Godot features from day one. GDScript allows writing code quickly within a controlled environment. The code editor now highlights the line numbers of lines that are type-safe. This will be vital in the future for optimizing small pieces of code that need more performance.

Revamped Inspector

The Godot inspector has been rewritten from scratch. It includes proper vector field editing, sub-inspectors for resource editing, better custom visual editors for many object types, comfortable spin-slider controls, better array and dictionary editing, and many more features.

KinematicBody2D (and 3D) improvements

Kinematic bodies are among Godot's most useful nodes: they allow creating very game-like character motion with little effort. For Godot 3.1 they have been considerably improved with:

- Support for snapping the body to the floor.
- Support for RayCast shapes in kinematic bodies.
- Support for synchronizing kinematic movement to physics, avoiding a one-frame delay.

New axis handling system

Godot 3.1 uses the novel concept of "action strength". This approach allows using actions for all use cases and makes it very easy to create in-game customizable mappings and customization screens.

Visual Shader Editor

This was a pending feature to re-implement for Godot 3.0, but it couldn't be done in time back then. The new version adds features such as PBR outputs, port previews, and easier-to-use mapping to inputs.

2D meshes

Godot now supports 2D meshes, which can be used from code or converted from sprites to avoid drawing large transparent areas.

2D skeletons

It is now possible to create 2D skeletons with the new Skeleton2D and Bone2D nodes. Additionally, Polygon2D vertices can be assigned to bones and weight-painted, and internal vertices can be added for better deformation.

Constructive Solid Geometry (CSG)

CSG tools have been added for fast level prototyping, allowing generic primitives and custom meshes to be combined via boolean operations to generate more complex shapes. They can also become colliders to test together with physics.

CPU-based particle system

Godot 3.0 integrated a GPU-based particle system, which can emit millions of particles at little performance cost. The developers have now added alternative CPUParticles and CPUParticles2D nodes that process particles on the CPU (drawing via the MultiMesh API). These nodes open the door to features that are not possible on the GPU, such as physics interaction, sub-emitters, and manual emission.

More VCS-friendly

The new 3.1 version includes some much-requested enhancements:

- Folded properties are no longer saved in scenes, avoiding unnecessary history pollution.
- Non-modified properties are no longer saved, which reduces text files considerably and makes history even more readable.
Improved C# support

In Godot 3.1, C# projects can be exported to Linux, macOS, and Windows. Support for Android, iOS, and HTML5 will come soon.

To know about other improvements in detail, visit the changelog or the official website.

- Microsoft announces Game stack with Xbox Live integration to Android and iOS
- OpenAI introduces Neural MMO, a multiagent game environment for reinforcement learning agents
- Google teases a game streaming service set for Game Developers Conference

Fortnite creator Epic games launch Epic games store where developers get 88% of revenue earned; challenging Valve’s dominance

Sugandha Lahoti
05 Dec 2018
3 min read
The game studio that brought the phenomenal online video game Fortnite to life has launched the Epic Games store. In a blog post on the Unreal Engine website, Epic stated that the store will have "fair economics and a direct relationship with players". All players who buy a game are subscribed to the developer's newsfeed, where the developer can contact them with updates and news about upcoming releases. Developers can also control their game pages and connect with YouTube content creators, Twitch streamers, and bloggers through the recently launched Support-A-Creator program.

The Epic Games store follows an 88/12 revenue split. "Developers receive 88% of revenue," the company wrote. "There are no tiers or thresholds. Epic takes 12%. And if you're using Unreal Engine, Epic will cover the 5% engine royalty for sales on the Epic Games store, out of Epic's 12%."

Source: Unreal Engine

Epic's inspiration for the 88/12 split may have come from Valve's Steam store (a major competitor to Epic Games), which recently tweaked its revenue sharing. "Starting from October 1, 2018, when a game makes over $10 million on Steam, the revenue share for that application will adjust to 75 percent/25 percent on earnings beyond $10 million," Valve wrote in its official blog post. "At $50 million, the revenue share will adjust to 80 percent/20 percent on earnings beyond $50 million."

The Epic Games store will launch with a few selected games on PC and Mac, then open up to other games, and to Android and other open platforms, throughout 2019. With this move, Epic Games is looking to attract more gamers and developers to its platform, and a developer-friendly revenue split will do most of that work for it. The better split also introduces competition into PC game distribution, where the standard 30/70 split had long gone unchallenged.
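The difference between the two models is easy to quantify. Here is a quick sketch comparing Epic's flat 88/12 split with Valve's tiered schedule as quoted above (illustrative arithmetic only, ignoring taxes, refunds, platform fees, and the Unreal engine-royalty waiver):

```python
def epic_dev_share(revenue):
    """Flat split: developers keep 88% at any revenue level."""
    return revenue * 0.88

def steam_dev_share(revenue):
    """Tiered split quoted by Valve: developers keep 70% up to $10M,
    75% on earnings from $10M to $50M, and 80% beyond $50M."""
    share = 0.70 * min(revenue, 10e6)
    if revenue > 10e6:
        share += 0.75 * (min(revenue, 50e6) - 10e6)
    if revenue > 50e6:
        share += 0.80 * (revenue - 50e6)
    return share

for revenue in (1e6, 20e6, 100e6):
    print(f"${revenue:,.0f}: Epic ${epic_dev_share(revenue):,.0f} "
          f"vs Steam ${steam_dev_share(revenue):,.0f}")
```

Even at $100 million in sales, where Valve's best tier applies to half the revenue, the flat 88% split still leaves the developer about $11 million ahead, which is the gap Epic is betting will pull developers to its store.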
Twitter users were fairly happy with this announcement and largely agreed that it poses a threat to Valve:

https://twitter.com/Grummz/status/1069975572984385537
https://twitter.com/SpaceLyon/status/1069979966501208065
https://twitter.com/lucasmtny/status/1069970212424953857
https://twitter.com/nickchester/status/1069970684112265217

The Epic Games team will reveal more details on upcoming game releases at the Game Awards this Thursday. Read the blog post by Epic Games to know more.

- Epic games CEO calls Google "irresponsible" for disclosing the security flaw in Fortnite Android Installer before patch was ready
- Google is missing out $50 million because of Fortnite's decision to bypass Play Store
- Implementing fuzzy logic to bring AI characters alive in Unity based 3D games