
Chapter 7. Advanced OUYA Functions - Graphics and Controls

This chapter covers the essential tools needed to manage multiple controllers and to optimize the graphics of the OUYA console with shaders, and it gives an introduction to lighting on the console.


Basic materials

A shader is code that is executed on the Graphics Processing Unit (GPU), usually found on a graphics card. Shaders are applied to create rendering effects such as fire, lighting, and toon effects.

Shaders are simple programs that transform a vertex or a pixel to manipulate the textures and lighting of a material or surface. You can use them whenever you want to generate or manipulate lighting effects on textures and materials. You can write a shader in different programming languages, from high-level languages such as C to low-level assembly languages.

The shader is a program that is used to produce appropriate levels of light and color in an image. Depending on the graphics API used by the game engine, you can produce intermediate or advanced scenes with lightmapping and global illumination.

To optimize shader performance, reduce the number of particles and increase the opacity of materials by using textures creatively. Remember that the OUYA console supports only 720p or 1080p output. More information on this can be found at http://docs.unity3d.com/Documentation/Manual/Materials.html.

Understanding shaders

A shader is used to perform graphics transformations and to create special effects such as lighting, lens effects, water, fire, fog, or particle systems.

The following are the prerequisites for gaining a solid knowledge of shaders and using them properly:

· Scripting functions, classes, and variables

· A NextGen game, which uses a very expensive render that accepts diffuse, specular, emissive, and parallax maps

· A cartoon game, which uses a very low-cost render that accepts only diffuse

· A basic knowledge of mathematics, especially quadratic functions and transformation math; in short, basic linear algebra

· Trigonometric calculations to calculate the intensity of environmental light

· All the textures we load must have power-of-2 dimensions, such as 64 x 64 or 128 x 64 (a quick check follows this list)
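As a quick way to verify the last point from a script, the following is a minimal sketch using Unity's standard Mathf.IsPowerOfTwo helper (the TextureSizeCheck class name is illustrative):

using UnityEngine;

public class TextureSizeCheck : MonoBehaviour
{
    public Texture2D texture; // assign the texture to test in the Inspector

    void Start()
    {
        // OUYA/OpenGL ES-friendly textures have power-of-2 dimensions,
        // for example, 64 x 64 or 128 x 64.
        bool ok = Mathf.IsPowerOfTwo(texture.width) &&
                  Mathf.IsPowerOfTwo(texture.height);
        Debug.Log(texture.name + (ok ? " has" : " does NOT have") + " power-of-2 dimensions");
    }
}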

The use of shaders requires a language created for the purpose; shaders are written in languages such as ShaderLab, GLSL, and CG. The Unity scripts that drive these shaders on OUYA's Tegra processor can be written in Boo, JavaScript, or C#. Some of the shader languages are explained as follows:

· ShaderLab: This is the shader language at the core of Unity3D, and is required by all shaders. It is a very basic language.

· CG: This is a shader language developed by Nvidia. It is powerful and easy to learn. CG is the most widely used shader language in Unity3D because it is a high-level language.

· GLSL: This is the OpenGL shading language, commonly used on mobile devices; it is a basic language and is very similar to the CG shader language.

OpenGL is also strict about how textures are loaded into memory: algorithms that implement lighting with shaders rely on power-of-2 texture sizes, for example, 64 x 64 and 1024 x 1024.

To create a shader, navigate to Assets | Create | Shader in the Unity menu bar. The shader can be edited by double-clicking on it in the Project view. The following is a structure that represents a basic shader:

Shader "Tutorial/Basic" {

Properties {

_Color ("Main Color", Color) = (1,0.5,0.5,1)

}

SubShader {

Pass {

Material {

Diffuse [_Color]

}

Lighting On

}

}

}
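Once saved, the shader can be assigned to a material. The following is a minimal sketch that finds the shader by its declared name and tints it at runtime; Shader.Find, renderer.material, and SetColor are standard Unity APIs, while the ApplyTutorialShader class name is illustrative:

using UnityEngine;

public class ApplyTutorialShader : MonoBehaviour
{
    void Start()
    {
        // Look up the shader by the name declared in its first line.
        Shader shader = Shader.Find("Tutorial/Basic");
        if (shader != null)
        {
            renderer.material.shader = shader;
            // _Color feeds the Diffuse [_Color] entry of the Material block.
            renderer.material.SetColor("_Color", Color.red);
        }
    }
}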

This structure allows us to use several shaders for the assets, items, and characters in the game scene.

The following light-related parameters should be modified:

· Transparency application

· Light reflectivity

· Filtering application

Types of shader processors

The programmer sends a set of vertices that form the scene graph. All vertices are processed by a vertex shader, where they can be transformed. The vertex shader also determines their texture mapping.

Texture mapping can be summarized with the following three characteristics:

· It loads an image that is defined as a texture

· It indicates how the texture is applied to each pixel of the surface

· It indicates the correspondence between the texture coordinates and the surface coordinates

Texture mapping produces a level of color that changes according to the proximity of the illuminated objects.

A geometry shader can delete or add vertices. Finally, the vertices are assembled into primitives that are rasterized, that is, transformed into pixels on the graphics display. The pixel shaders receive individual pixels and perform calculations on them to determine properties such as color and lighting.

The vertex or fragment shader is responsible for modifying these points in depth, which creates a high-quality image. Fragment shaders and pixel shaders follow the same principle. This pipeline is used by the Tegra 3, OUYA's graphics processor, which performs the optimization process while rendering the game scene.
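As a small illustration of the per-pixel lighting calculation mentioned above, the classic Lambert diffuse term is just the clamped dot product of the surface normal and the light direction. The following sketch shows the same math in a Unity script (a shader would use CG's dot and max intrinsics instead):

using UnityEngine;

public static class LambertExample
{
    // Lambert diffuse term: intensity = max(0, N · L), the cosine of the
    // angle between the surface normal and the direction to the light.
    public static float Intensity(Vector3 normal, Vector3 lightDir)
    {
        return Mathf.Max(0f, Vector3.Dot(normal.normalized, lightDir.normalized));
    }
}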


To understand more about the different shaders, it is necessary to know how they differ in structure and complexity. The shaders are explained as follows:

· Vertex shaders: This shader's programming is simple; it uses a C-like language. It modifies vertex positions, colors, and texture coordinates. It receives information about the vertex and is responsible for transforming its coordinates, texture coordinates, normals, and many other parameters.

· Vertex/fragment shaders: These shaders are a little more complex because they divide 3D models according to the intensity scales of a color. They are more complex to code, as they calculate the color of individual pixels, and they are more complete, yet highly editable. Note that DirectX refers to them as pixel shaders, and OpenGL calls them fragment shaders. They receive the information associated with each pixel occupied by the geometry of a model on the screen and are responsible for the final color.

· Fixed function: This is usually used on devices that have pixel limitations or low capacity. It is written entirely in ShaderLab and is used on older devices.

To observe the real difference between the shaders in our scene, it should be understood that post-effects vary between them. These effects can only be seen in the Unity3D Professional version or in the 30-day trial version of the game engine offered by the company, and the compiled shader assembly code can only be opened and viewed using Unity Professional.

The following screenshot shows lighting without the CG shader and without using SSAO (screen space ambient occlusion). The part on the left shows the standard shader, and the part on the right shows the shader using the Unity3D Professional version.


To optimize our models and increase performance, we should take into account materials, assets, and items (meshes). We also need to consider the following tips:

· Do not leave too many small objects in the scene; it is better to combine them and create a single object (see the mesh-combining sketch after this list)

· Do not combine objects that are far away from each other, because when the camera sees a corner of one, it will draw all of the combined objects

· If you have an object that has multiple materials, it is best not to combine them, as it will reduce performance
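The first tip can be automated. The following minimal sketch merges all child meshes into a single mesh at startup, using Unity's standard Mesh.CombineMeshes API; the CombineChildMeshes class name is illustrative, and the sketch assumes all children share one material:

using UnityEngine;

public class CombineChildMeshes : MonoBehaviour
{
    void Start()
    {
        MeshFilter[] filters = GetComponentsInChildren<MeshFilter>();
        CombineInstance[] combine = new CombineInstance[filters.Length];
        Material shared = null;

        for (int i = 0; i < filters.Length; i++)
        {
            combine[i].mesh = filters[i].sharedMesh;
            combine[i].transform = filters[i].transform.localToWorldMatrix;
            if (shared == null)
            {
                shared = filters[i].renderer.sharedMaterial; // assumes one material
            }
            filters[i].gameObject.SetActive(false); // hide the originals
        }

        // A single mesh means a single draw call for the whole group.
        Mesh combined = new Mesh();
        combined.CombineMeshes(combine);
        gameObject.AddComponent<MeshFilter>().mesh = combined;
        gameObject.AddComponent<MeshRenderer>().sharedMaterial = shared;
    }
}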

The CG programming language for Nvidia Tegra 3

C for Graphics (CG) is a high-level shading language developed by Nvidia in close collaboration with Microsoft, to program vertex and pixel shaders. CG is based on the C programming language and they both share the same syntax.

The following is a sample CG vertex shader:
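A minimal sketch of such a shader, in Unity's ShaderLab/CG style and following the standard Unity vertex shader example rather than any Tegra-specific listing, is shown below; the vertex program transforms each vertex into clip space and encodes its normal as a color:

Shader "Tutorial/VertexCG" {
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f {
                float4 pos : SV_POSITION;
                fixed4 color : COLOR;
            };

            // Vertex shader: runs once per vertex on the GPU
            v2f vert (appdata_base v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex); // object space to clip space
                o.color.xyz = v.normal * 0.5 + 0.5;      // visualize the normal
                o.color.w = 1.0;
                return o;
            }

            // Fragment shader: outputs the interpolated color for each pixel
            fixed4 frag (v2f i) : SV_Target {
                return i.color;
            }
            ENDCG
        }
    }
}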


Characteristics of the Tegra 3 processor

The Tegra 3 processor is a mobile multicore processor from Nvidia, used in the OUYA console. It has the following characteristics:

· Physics-based shaders for the natural projection of light, with functions to manipulate the transparency and opacity of objects

· Reflection shaders for use in particle effects

· Combining of normal maps to improve graphical quality

· Polygon optimization to reduce the size of the game

· Optimized global illumination that generates more realistic environments

· Ambient occlusion

· Skin rendering

Visit http://unity3d.com/unity/download/archive/ to download shaders for Unity3D.

Lighting

The lighting of a scene is very important because the 3D models can interact with single or multiple light sources, or with light probes, which produce different lighting intensities in the game.

For global illumination in the 3D engine, it is necessary to save the scene with the 3D models and light sources marked as static components. The 3D models and lights are static components because they don't have any physics that generates movement. This means that static game objects do not interact with PhysX.
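As a hedged illustration, objects can be flagged as static from an editor script as well as through the Static checkbox in the Inspector; GameObject.isStatic, Selection, and MenuItem are standard Unity editor-facing APIs, while the menu path and class name are illustrative:

using UnityEngine;
using UnityEditor;

// Place this file in an Editor folder.
public static class MakeSelectionStatic
{
    // Flags every selected object as static so the lightmapper
    // includes it in the bake.
    [MenuItem("Tools/Mark Selection Static")]
    static void MarkStatic()
    {
        foreach (GameObject go in Selection.gameObjects)
        {
            go.isStatic = true;
        }
    }
}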

Remember that Unity3D uses the following application programming interfaces:

· DirectX 11 on Windows

· OpenGL on Mac and Linux

· OpenGL ES on Android and iOS

The steps to set up the lighting feature in Unity are as follows:

1. Open the maya.unity scene located in the Assets folder. To use lightmapping in our maya.unity scene, open the Lightmapping window by navigating to Window | Lightmapping in the menu bar.

Unity3D integrates lightmaps and advanced illumination for online game and mobile device design.


2. Immediately, you will see a window with three sections (Object, Bake, and Maps). Unity3D has an integrated lightmapper named Beast. From the Unity3D IDE, you can create global illumination maps covering the lights and shadows of the scene models, thus increasing performance.

Set the values of the Final Gather Rays and Resolution parameters in the Bake section to 200 and 10 respectively, as shown in the following screenshot:


3. Then we click on Bake, and our lightmapping is generated after a few minutes (the time taken depends on the number of 3D models). Unity3D will create the map of lights and shadows for the entire scene.

4. The lightmapping results depend on the rendering path. To modify it without having to rebake the lightmapping for the whole scene, navigate to File | Build Settings | Player Settings | Other Settings. Make the appropriate changes to the player settings and the main camera, as shown in the following screenshot:


Lightmapping player settings
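If only one camera needs a different route, the rendering path can also be overridden per camera from a script. The following is a minimal sketch using the standard Camera.renderingPath property and RenderingPath enumeration (the deferred option requires Unity Pro):

using UnityEngine;

public class OverrideRenderingPath : MonoBehaviour
{
    void Start()
    {
        // Override the path chosen in Player Settings for the main camera only.
        Camera.main.renderingPath = RenderingPath.DeferredLighting;
    }
}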

It is necessary to take into account aspects such as lights and lightmaps, so keep the following things in mind:

· Do not use more than one dynamic shadow-casting light per scene.

· By using light probes, we can optimize the game scene and make the scenes more realistic. You can add light probes to any object in the scene (by navigating to Component | Rendering | Light Probe Group). The light probes appear as yellow spheres that can be positioned in the same manner as the game object, as shown in the following screenshot:


Reducing draw calls improves the performance of a video game. More information on lightmapping is available on the official website of Unity3D, at http://docs.unity3d.com/Documentation/Manual/Lightmapping.html.

Multiple controls

In this section, you will learn how to use several controllers that independently manage different characters.

Depending on the controller, this script will return a subclass of the GameController class or an OuyaController class. You should check what is returned and cast it accordingly.

We can create a script by performing the following steps:

1. The first step in establishing our second player is to select the SkeletonData folder by navigating to Assets | Ouya | Examples | Models and incorporate it into the scene. The 3D model assigned to the controller will look as shown in the following screenshot:


2. For this section, it is important to note that the discrimination between the controls of each character stems from the OuyaInputHandlerExample class, as shown in the following code:

public class OuyaInputHandlerExample : MonoBehaviour, OuyaSDK.IPauseListener, OuyaSDK.IResumeListener

3. The OuyaInputHandlerExample.cs file has an enumeration field that binds the input to a specific controller, declared as follows:

public OuyaSDK.OuyaPlayer player;


4. It is important to note that you should incorporate the character controller script and set the animations in the new character, as shown in the following screenshot:


Animation settings

5. The OuyaInputHandlerExample.cs file has a filter that links a GameObject to each player. By doing this, we ensure that the various control actions belong to a specific character. This is shown in the following code:

void HandleButtonEvent(OuyaSDK.OuyaPlayer p, OuyaSDK.KeyEnum b, OuyaSDK.InputAction bs)
{
    if (!player.Equals(p)) { return; }

    if (b.Equals(OuyaSDK.KeyEnum.BUTTON_O) && bs.Equals(OuyaSDK.InputAction.KeyDown))
    {
        this.animation.Play("attack");
    }
}

The following controller button mapping list is a guide for the newly designed OUYA controller. Multiple controller types exist, and these keys are specific to controllers with a setup that includes four buttons, two triggers, two shoulder buttons, two analog sticks, and a directional D-pad, as shown in the following list:

controller.o - Button - The O button

controller.u - Button - The U button

controller.y - Button - The Y button

controller.a - Button - The A button

controller.lb - Button - The L1 shoulder button

controller.lt - Trigger - The L2 shoulder trigger

controller.rb - Button - The R1 shoulder button

controller.rt - Trigger - The R2 shoulder trigger

controller.leftStick - Joystick - The left OUYA joystick

controller.rightStick - Joystick - The right OUYA joystick

controller.dpad - Directional Pad - The directional pad

Common problems

In this section, we will deal with the commonly faced issues in OUYA while using multiple controllers.

Controller always pairing as the second controller

First, unplug any USB mouse, USB keyboard, USB HDD, or other USB controller; some users have reported these taking up a controller slot. If that fails, or you don't have anything else plugged in, try navigating to Manage | System | Advanced | Bluetooth, check if there is a phantom controller in the list (one that isn't the one you are using), and delete it from the list.

The second OUYA controller paired as the third controller

This is a situation where you paired the second controller and it paired itself as the third controller, even though you only had two controllers connected. We need the second controller to be recognized by the console as controller number 2.

To correct this error, perform the following steps:

1. Navigate to Manage | Controllers.

2. Go to Bluetooth, and using the malfunctioning controller, select the OUYA Game Controller option and end the connection.

3. Go back and try to pair it again.

Summary

In this chapter, we evaluated the different shaders and their processors to create optimized games for the OUYA console. We also covered how to use multiple controllers.

In the next chapter, we will evaluate the future of the OUYA console and other technologies that help develop new hardware tools for it.