Working with the Game HUD
Namespace: $GG.ui.hud
The Gig Game HUD (Heads-Up Display) is a critical component designed to enhance the gaming experience by providing a real-time display of game information, player interactions, and multimedia content. This page breaks down its technical components and functionality.
HUD Launcher
The HUD Launcher ($GG.ui.hud.launcher) in Gig Game is a critical component that manages the lifecycle and functionality of the HUD window. The HUD window displays all real-time game information, multimedia content, and player interactions. The launcher provides methods to open, close, and interact with this HUD window, ensuring a seamless and interactive user experience.
Note: The HUD utilizes the $GG.ui library for graphic management. Therefore, you cannot use $GG.ui for both the game console and the HUD simultaneously. Additionally, when using the HUD you do not need to call $GG.ui.initialize beforehand; it is called automatically when the HUD opens.
For more information on Launcher Endpoints, click here.
Example Usage
To open a HUD window and set it to a specific stage:
$GG.ui.hud.launcher.open('myStage', { skipIntro: true })
  .then(stage => {
    console.log('HUD stage opened:', stage);
    // Queue actions for the HUD
    stage.queueText('Welcome to the game!', 5000, 'fade');
  })
  .catch(error => {
    console.error('Error opening HUD:', error);
  });
HUD Stage
The $GG.ui.hud.types.hudBase class is a central component in the Gig Game HUD system, responsible for managing the overall HUD stage, including HUD actions, player interactions, and the visual representation of the stage. Extending $GG.ui.visual.type.stages.base, it inherits its properties and methods while adding functionality specific to the HUD environment.
The primary purpose of $GG.ui.hud.types.hudBase is to organize and manage the display and behavior of various HUD actions (such as images, videos, and text) and player interactions. It ensures that these elements are rendered correctly, timed accurately, and interact seamlessly to provide a cohesive user experience.
The HUD is essentially a modal window composed of three full-width, full-height layers stacked on top of each other:
2D Canvas
- Purpose: The 2D canvas is used for rendering traditional 2D graphics such as HUD elements, overlays, and simple animations. This is the canvas context passed into the onDraw event handler.
- Functionality: This canvas handles all 2D drawing operations, utilizing the Canvas 2D API. It is ideal for displaying static images, text, and basic shapes that make up the HUD interface.
- Implementation: The canvas context can be retrieved with $GG.ui.visual.activeStage.getContext, and it is automatically drawn upon via the onDraw(context) event handler in all stage types and stage props that are instantiated and actively displayed.
3D Canvas
- Purpose: The 3D canvas, powered by Three.js, is used for rendering complex 3D graphics and animations. This includes 3D models, interactive scenes, and advanced visual effects.
- Functionality: Utilizing the WebGL context, the 3D canvas enables immersive 3D experiences that can be seamlessly integrated into the HUD. It allows for high-performance rendering of 3D content, leveraging the capabilities of the GPU. The canvas context can be retrieved with $GG.ui.visual.activeStage.getThreeBackgroundContext(), and the renderer with $GG.ui.visual.activeStage.getThreeRenderer().
HTML Layer
- Purpose: The HTML layer is used for embedding web-based content such as iframes, forms, and other interactive HTML elements. This enables the display of external web content, such as YouTube videos or web-based forms, within the HUD.
- Functionality: The HTML layer sits at the bottom of the canvas stack. It supports all standard HTML and CSS features, making it highly versatile. The element reference can be retrieved with $GG.ui.visual.activeStage.getHtmlContainer(). If you need this layer to receive user input, you must disable pointer events on the 2D canvas, as shown in the sketch below.
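The following minimal sketch shows how each layer can be retrieved from the active stage using the accessors above; the iframe content is purely illustrative.
// Retrieve the three layers of the active HUD stage
const ctx2d = $GG.ui.visual.activeStage.getContext();                 // 2D canvas context
const ctx3d = $GG.ui.visual.activeStage.getThreeBackgroundContext();  // Three.js/WebGL context
const renderer = $GG.ui.visual.activeStage.getThreeRenderer();        // Three.js renderer
const htmlLayer = $GG.ui.visual.activeStage.getHtmlContainer();       // HTML container element

// Embed web content in the HTML layer. For the iframe to receive clicks,
// pointer events on the 2D canvas must be disabled (see openMP4 in the
// PhotoBomb example later on this page).
const frame = document.createElement("iframe");
frame.src = "https://example.com/embed"; // hypothetical URL
frame.style.width = "100%";
frame.style.height = "100%";
htmlLayer.appendChild(frame);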
By extending $GG.ui.visual.type.stages.base, $GG.ui.hud.types.hudBase inherits its methods and properties, enabling it to function as a stage within the broader HUD system. This also means it can host its own props and provides the same capabilities as any other stage within the $GG.ui.visual framework.
Why Use This HUD?
You might wonder, "Why not just use my own custom solution instead of extending this one?" The answer lies in the specialized functionalities of this HUD. It is designed to intercept photo sharing events from player game controllers and facilitate the joining of new games by displaying a QR code. This makes it a powerful tool for enhancing player interaction and seamless game integration. Moreover, it furthers our goal to provide a common interface for players to join a Gig Game, ensuring consistency and ease of use across the platform.
Constructor
The constructor of $GG.ui.hud.types.hudBase initializes the stage with a name and a set of properties. It sets up the basic environment for the HUD stage and prepares it for action management and rendering.
Constructor: constructor(name, properties = {}, extended = false)
- Parameters:
  - name: The name of the stage.
  - properties: Additional properties for initializing the stage.
  - extended: A boolean indicating whether the stage is an extension of another stage.
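As a brief sketch (assuming the stage registers itself under the given name when constructed, which is the pattern the PhotoBomb openHud example later on this page relies on), constructing and opening a stage looks like this:
// Create a HUD stage named "lobby" with default properties,
// then open the HUD window and activate the stage by name.
const lobby = new $GG.ui.hud.types.hudBase("lobby");
$GG.ui.hud.launcher.open("lobby").then((stage) => {
  stage.queueText("Scan the QR code to join!", 4000, "fade");
});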
Key Features and Methods
Properties
- _action: The current action being executed on the stage.
- _actionQueue: A queue of actions to be executed sequentially.
- _actionQueueDelay: The delay between executing queued actions.
- _actionStartTime: The start time of the current action.
- _actionElapsedTime: The elapsed time since the current action started.
- _actionState: The state of the current action.
- _showPlayerImages: A boolean indicating whether player images should be displayed.
- _qrCode: The QR code URL.
- _showQrCode: A boolean indicating whether the QR code should be shown.
- _qrImageLoaded: A boolean indicating whether the QR code image has been loaded.
- _videoLoaded: A boolean indicating whether the background video is loaded.
Methods
queue(hudAction)
- Description: Adds a HUD action to the action queue for sequential execution.
- Parameters:
  - hudAction: The HUD action to be queued.
queueImage(path, duration, properties = {})
- Description: Queues an image action to be displayed on the stage.
- Parameters:
  - path: Path to the image file.
  - duration: Duration for which the image will be displayed.
  - properties: Additional properties for the image action.
queueImageData(data, duration, properties = {})
- Description: Queues an image action with direct image data.
- Parameters:
  - data: The image data.
  - duration: Duration for which the image will be displayed.
  - properties: Additional properties for the image action.
queueVideo(path, properties = {})
- Description: Queues a video action to be displayed on the stage.
- Parameters:
  - path: Path to the video file.
  - properties: Additional properties for the video action.
queueText(text, duration = 3000, transition = "fade", properties = {})
- Description: Queues a text action to be displayed on the stage.
- Parameters:
  - text: The text content to be displayed.
  - duration: Duration for which the text will be displayed.
  - transition: The type of transition effect (e.g., fade).
  - properties: Additional properties for the text action.
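For example, a simple slideshow can be built by queueing several actions back to back on an open stage; the asset paths below are hypothetical.
// Actions run in the order they are queued.
stage.queueText("Say cheese!", 3000, "fade");
stage.queueImage("/assets/images/group-photo.png", 5000); // hypothetical path
stage.queueVideo("/assets/videos/outro.mp4");             // hypothetical path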
Optional Event Handlers
- onInit(properties): Can be overridden to provide custom initialization logic when the stage is created.
- onTick(timeStamp, deltaTime): Manages updates for the stage and its actions on each tick, handling timing and state transitions.
- onDraw(context): Handles the rendering logic for the stage, drawing elements on the 2D canvas.
- onStageActive(): Logic that executes when the stage becomes active.
- onStageInactive(): Logic that executes when the stage becomes inactive.
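As a minimal sketch (the class name and drawing logic here are illustrative only), a custom stage overrides these handlers like this:
export class ScoreboardHud extends $GG.ui.hud.types.hudBase {
  onInit(properties) {
    // Custom initialization when the stage is created
    this.score = (properties && properties.score) || 0;
  }
  onDraw(context) {
    // Render onto the 2D canvas layer
    context.font = "32px arial";
    context.fillStyle = "#FFFFFF";
    context.fillText("Score: " + this.score, 20, 40);
  }
  onStageActive() {
    // Runs when this stage becomes the active stage
  }
}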
HUD Actions
The HudAction class, located at $GG.ui.hud.types.actions.base and inheriting from $GG.ui.visual.type.props.base, serves as the foundational base class for all HUD actions. It provides the core functionality and structure that specific HUD actions build upon and is designed to be extended for the creation of new effects.
Essentially, it is a stage prop like any other, but with one unique characteristic: it is temporary. HUD actions are designed to perform a task temporarily and then disappear. They transition through the states Pending, Active, and Complete, raising events at each state change. Examples include displaying text across the screen, showing images, or playing short videos.
Additionally, these actions can be queued on a HUD stage, allowing for sequential display and interaction.
Constructor:
- Signature: constructor(properties = {})
Properties:
- state: Represents the current state of the HUD action. Set with $GG.ui.hud.types.actions.state, an enumeration that defines the possible states (Pending, Active, Complete).
Methods:
- start(): Initializes and starts the HUD action, transitioning its state from Pending to Active. This is called by the default HUD queue.
Event Handlers:
- onInit(properties): Can be overridden to provide custom initialization logic when the action is created.
- onDraw(context): Custom drawing logic can be implemented here to define how the action is rendered on the screen.
- onResize(w, h): Invoked when the HUD needs to resize, allowing the action to adjust its dimensions accordingly.
- onStart(): Executed when the action starts, allowing for initialization specific to the start event.
- onStageActive(): Logic that executes when the stage becomes active.
- onStageInactive(): Logic that executes when the stage becomes inactive.
Because this class extends stage props, it also inherits all functionality of that class.
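Below is a minimal sketch of a custom action (the class itself is illustrative; the state handling follows the PhotoBomb action shown later on this page). It fills the screen with a colored flash and then marks itself complete so the queue can advance.
export class FlashAction extends $GG.ui.hud.types.actions.base {
  constructor(color = "#FFFFFF", duration = 500, properties = {}) {
    super(properties);
    this.color = color;       // Flash color
    this.duration = duration; // How long the flash stays on screen, in ms
    this.startTime = null;    // Set on the first tick after the action starts
  }
  onTick(timeStamp, deltaTime) {
    if (this.state !== $GG.ui.hud.types.actions.state.Active) return;
    if (this.startTime == null) this.startTime = timeStamp;
    if (timeStamp - this.startTime >= this.duration) {
      // Mark the action complete so the queue can advance to the next action
      this.state = $GG.ui.hud.types.actions.state.Complete;
    }
  }
  onDraw(context) {
    if (this.state !== $GG.ui.hud.types.actions.state.Active) return;
    context.fillStyle = this.color;
    context.fillRect(0, 0, this.width, this.height);
  }
}
Queue it like any other action: stage.queue(new FlashAction("#FF0000", 300)).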
Prebuilt Actions
Building on the foundational structure of the base class, several HUD actions are provided to deliver various types of multimedia content. These prebuilt actions can be used directly and are managed through the queue methods of the default hudBase class described earlier.
$GG.ui.hud.types.actions.imageAction
- Purpose: Displays an image on the HUD for a specified duration.
- Key Properties: image, imageAngle, imageStartTime, imageElapsedTime, imageFadeInDuration, imageDisplayDuration, imageFadeOutDuration, imagePauseDuration, imageState.
- Methods: onTick(), onDraw(), drawImage().
$GG.ui.hud.types.actions.videoAction
- Purpose: Displays a video on the HUD.
- Key Properties: videoLoaded, videoPath, muted.
- Methods: onTick(), onDraw(), drawVideo().
$GG.ui.hud.types.actions.textAction
- Purpose: Displays scrolling text on the HUD.
- Key Properties: text, minDuration, transitionType, textState, textStartTime, textElapsedTime, textLeadInDuration, textDisplayDuration, textLeadOutDuration, textPauseDuration, pixelsPerTime, verticalScroll, backgroundColor, borderColor, textColor, textAlign, fontSize, font, lineSpacing, cached, cachedValues, preDisplayDuration, postDisplayDuration.
- Methods: onTick(), onDraw(), cacheValues(), drawFade(), drawSlide(), drawExpand(), drawZoom(), drawSlideThrough(), wrapText().
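Prebuilt actions can also be constructed directly and handed to queue() when you want to hook their events. The sketch below follows the imageAction constructor call used in the PhotoBomb queueImage override later on this page; the second argument appears to be raw image data and is left null here, so treat the exact signature as an assumption.
// Build an image action, play a sound when it starts, then queue it on the stage.
const snapshot = new $GG.ui.hud.types.actions.imageAction(
  "/assets/images/selfie.png", // hypothetical path
  null,                        // no raw image data; load from the path instead
  5000,                        // display duration in milliseconds
  {}                           // additional properties
);
snapshot.on("started", () => {
  $GG.ui.audio.play("camera", false, 0.5);
});
stage.queue(snapshot);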
Example Extending the HUD
Here is an example of extending the HUD, taken directly from our live PhotoBomb application. The extended actions allow for temporary text with an author's name to be displayed on the screen. Additionally, it enables reactions, such as smileys and frowns, each making a sound as they move up and down across the HUD.
To give some background context to this example, the PhotoBomb app is designed to enhance user interactivity at parties by allowing users to display looping background MP4 videos from a vast library. Guests can scan the on-screen QR code to send selfies, messages, or emoji reactions to the display, creating a lively and engaging atmosphere. It's live right now on the Gig Game service. You can see it working!
The example below shows a slimmed-down version of our in-app Host Console, which simply opens the HUD and communicates with the controllers. Notably, our $GG library was built to run this code against the sandbox servers automatically when it is executed outside the Gig Game environment, on a localhost web server on your own workstation, allowing you to test and debug the application in a controlled environment before deploying it to the live Gig Game platform. When deployed live, the code communicates with the game launch service to configure itself for production. Yeah, it's that automated.
Example Structure:
/project-root
├── /console
│ └── index.html
└── /controller
└── index.html
The Console
console/index.html
<!DOCTYPE html>
<html>
<head>
<title>PhotoBomb</title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
</head>
<!-- gg-gamekey is assigned in Gig Game HUD when you create an app to develop -->
<!-- gg-hostkey is assigned in Gig Game HUD when you create an app to develop -->
<body
gg-gamekey="{Key Assigned on App Creation}"
gg-hostkey="{Key Assigned on App Creation}"
gg-language="en">
<div>This example will open the HUD on load using the sandbox environment!</div>
<!-- GigGame Client Server Core Framework -->
<script src="https://launch.gig.game/api/js?key={ Your API Key }&libraries=ui"></script>
<!-- jQuery (console.js uses $(document).ready; path assumed to mirror the controller page's lib folder) -->
<script src="lib/jquery/3.5.1/jquery.min.js"></script>
<!-- JS Game Engine -->
<script src="./console.js" type="module"></script>
</body>
</html>
console/console.js
This script defines the logic for managing HUD interactions and player communications in the PhotoBomb application. It imports the PhotoBombHud and PhotoBombHudAction classes and sets up a Console class that handles server events and updates the HUD accordingly.
The Console class initializes by binding socket events to handle incoming messages and reactions from players. When a message is received, it creates a PhotoBombHudAction to display the message temporarily on the HUD, ensuring it appears for an appropriate duration based on the message length. Reactions are also displayed on the HUD, with visual feedback provided to users.
Additionally, the Console class includes methods to manage the HUD lifecycle, ensuring it opens and closes properly. The openHud method initializes the HUD and plays a specified video. The script listens for the server's readiness and loads the necessary assets before initializing the Console class and opening the HUD with a predefined video. This setup enables real-time interaction and dynamic content display during events or parties, enhancing the user experience with multimedia and interactive elements.
import { PhotoBombHud } from "./photobombhud.js";
import { PhotoBombHudAction } from "./photobombhudaction.js";
export class Console {
constructor() {
this.bindSocketEvents();
// If the console unloads, shut down the HUD too
window.addEventListener("beforeunload", function () {
if (window.hud && !window.hud.closed) {
window.hud.close();
}
});
}
bindSocketEvents() {
// Attach event listeners for server events
$GG.server.attachEvent("SendMessage", this.sendMessageHandler.bind(this));
$GG.server.attachEvent("SendReaction", this.sendReactionHandler.bind(this));
}
sendMessageHandler(gs, o, p) {
// Send the message to the HUD
this.sendMessageToHud(gs.message || "", p.playerName, "message");
}
sendReactionHandler(gs, o, p) {
// Get the current stage and show the reaction
var stage = $GG.ui.hud.launcher.getStage();
stage.instance.showReaction(gs.reaction);
}
async sendMessageToHud(message, name, imageName = null) {
// Calculate the display duration based on word count
let wordsCount = message.trim().split(/\s+/).length;
let seconds = Math.max(3, Math.min(wordsCount, 20));
// Get the current stage
var stage = $GG.ui.hud.launcher.getStage();
if (stage != null) {
// Create a new HUD action with the message
var action = new PhotoBombHudAction(
message,
name,
seconds * 1000,
"slide-through-right",
{
textColor: "#FFFFFF",
backgroundColor: "rgba(0,0,0,0.5)",
borderColor: "#FFFFFF",
iconName: imageName,
}
);
// Play sound effects on action lead-in and lead-out
action.on("leadIn", () => {
$GG.ui.audio.play("swoosh", false, 0.5);
});
action.on("leadOut", () => {
$GG.ui.audio.play("swoosh", false, 0.5);
});
// Queue the action on the stage
var prop = stage.instance.queue(action);
}
}
async openHud(path = null) {
// Get or create the photobomb HUD stage
var stage = $GG.ui.visual.getStage("photobomb");
if (!stage) {
stage = new PhotoBombHud("photobomb");
} else {
stage = stage.instance;
}
// Open the HUD and play the specified video
$GG.ui.hud.launcher.open("photobomb").then(() => {
stage.openMP4(path, this.muted === "yes");
});
}
}
$(document).ready(function () {
// Initialize the application once the server is ready
$GG.server.attachEvent("OnReady", () => {
$GG.ui.assets.loadAssetFile("./assets.json").then((r) => {
var app = new Console();
// Open the HUD with a specified video
app.openHud("/mypath/myvideo.mp4");
});
});
});
console/photobombhud.js
The PhotoBombHud class is an extension of the base HUD class $GG.ui.hud.types.hudBase within the Gig Game framework. This class is designed to enhance the standard HUD functionality by integrating custom features such as reactions, sound effects, and the ability to embed and control YouTube and MP4 video content directly within the HUD. It provides a practical example of how developers can customize the HUD to create a more interactive and engaging user experience.
import { Reaction } from "./reaction.js";
/**
* Class representing the PhotoBomb HUD (Heads-Up Display).
* Inherits from $GG.ui.hud.types.hudBase.
* Example of custom hud. We are extending to support youtube
* and turn off pointer events on 2d context.
*/
export class PhotoBombHud extends $GG.ui.hud.types.hudBase {
constructor(name, properties = {}) {
super(name, properties);
this.reactions = [];
}
/**
* Method called when the HUD stage becomes active.
* Left empty here; pointer events on the HUD canvas are handled in openMP4 instead.
*/
onStageActive() {}
/**
* Method called to draw on the HUD context.
* Currently, it does not perform any drawing operations.
* @param {CanvasRenderingContext2D} context - The canvas rendering context.
*/
//onDraw(context) {} we don't need this for our purposes; we are not drawing with the stage directly
/**
* Method called to draw reaction on screen
* @param {string} reaction
*/
showReaction(reaction) {
var prop = new Reaction(reaction, 3000);
prop.on("started", () => {
const max = reaction === "happy" ? 4 : reaction === "funny" ? 2 : 1;
const random = Math.floor(Math.random() * max) + 1;
const audioFile = `${reaction}-${random}`;
$GG.ui.audio.play(audioFile, false, 1, true);
});
prop.on("completed", () => {
this.destroyProp(prop.id);
});
this.registerProp(prop);
}
/**
* Overriding queueImage on base so I can add a sound effect
* @param {any} path
* @param {any} duration
* @param {any} properties
* @returns
*/
queueImage(path, duration, properties = {}) {
const prop = new $GG.ui.hud.types.actions.imageAction(
path,
null,
duration,
properties
);
this.queue(prop);
prop.on("started", () => {
$GG.ui.audio.play("camera", false, 0.5);
});
return prop;
}
/**
* Opens a MP4 video in the HUD.
* @param {any} path
* @param {any} muted
*/
openMP4(path, muted) {
let hudCanvas = window.hud.document.getElementById("1spark2D");
if (hudCanvas) {
// Disable pointer events on the canvas to prevent interactions
hudCanvas.style.pointerEvents = muted ? "auto" : "none";
} else {
console.log("Canvas element not found in child window.");
}
// Get the HTML container of the active stage
const videoContainer = $GG.ui.visual.activeStage.getHtmlContainer();
videoContainer.innerHTML = "";
// Create a video element
const videoElement = document.createElement("video");
// Set video element attributes
videoElement.src = path;
videoElement.muted = muted;
videoElement.controls = true; // Adds playback controls to the video
videoElement.autoplay = true; // Automatically starts playback
videoElement.loop = true;
// Apply CSS styles to make the video cover the container
videoElement.style.position = "absolute";
videoElement.style.top = "50%";
videoElement.style.left = "50%";
videoElement.style.width = "100%";
videoElement.style.height = "100%";
videoElement.style.objectFit = "cover";
videoElement.style.transform = "translate(-50%, -50%)";
// Apply CSS styles to the container
videoContainer.style.position = "relative";
videoContainer.style.width = "100%";
videoContainer.style.height = "100%";
videoContainer.style.overflow = "hidden";
// Append the video element to the video container
videoContainer.appendChild(videoElement);
}
}
console/photobombhudaction.js
The PhotoBombHudAction class is used to display temporary messages and animations on the screen within the PhotoBomb application. It manages how text and other visual elements appear, move, and disappear in a smooth, visually appealing way. This class handles different stages of the message's life cycle, including how it enters, is displayed, and exits the screen. It also allows customization of the text's appearance, such as font, color, and background, and can include additional elements like icons. Essentially, it's designed to create dynamic, engaging text-based animations for enhancing user interaction during events or parties.
export class PhotoBombHudAction extends $GG.ui.hud.types.actions.base {
constructor(
text,
name,
minDuration = 3000,
transitionType = "slide-through-right",
properties = {}
) {
super(properties);
this.text = text; // The text to be displayed
this.name = name; // The author's name
this.minDuration = minDuration; // Minimum duration for the display
this.transitionType = transitionType; // Type of transition effect
this.textState = "ready"; // Initial state of the text
this.textStartTime = 0; // Start time of the text display
this.textElapsedTime = 0; // Elapsed time since the text started
this.textLeadInDuration = 500; // Duration of the lead-in transition
this.textDisplayDuration = minDuration; // Duration of the text display
this.textLeadOutDuration = 500; // Duration of the lead-out transition
this.textPauseDuration = 1000; // Duration of the pause between transitions
this.pixelsPerTime = 0; // Pixels per unit time for scrolling
this.verticalScroll = 0; // Vertical scroll amount
this.backgroundColor =
properties.backgroundColor || "rgba(0, 0, 0, 0.8)"; // Background color of the text box
this.borderColor = properties.borderColor || "#0b9047"; // Border color of the text box
this.textColor = properties.textColor || "white"; // Text color
this.textAlign = properties.textAlign || "left"; // Text alignment
this.lineSpacing =
properties.lineSpacing == undefined ? 2 : properties.lineSpacing; // Line spacing for text
this.cached = false; // Whether the values are cached
this.cachedValues = {}; // Cached values for rendering
// Configurable durations for pre-display and post-display
this.preDisplayDuration = 1000;
this.postDisplayDuration = 1000;
if (properties.iconName != null)
this.iconImage = $GG.ui.assets.getImage(properties.iconName); // Icon image for the text box
}
// GigGame Stage Tick Event Handler
onTick(timeStamp, deltaTime) {
if (this.state === $GG.ui.hud.types.actions.state.Active) {
this.updateTextState(timeStamp); // Update the state of the text
}
}
// GigGame Stage Draw Event Handler
onDraw(context) {
context.clearRect(0, 0, this.width, this.height); // Clear the canvas
if (!this.cached) {
this.cacheValues(context); // Cache values for rendering
this.cached = true;
}
if (this.state == $GG.ui.hud.types.actions.state.Active) {
switch (this.transitionType) {
case "slide-through-left":
this.drawSlide(context, "left"); // Draw slide-in from the left
break;
case "slide-through-right":
this.drawSlide(context, "right"); // Draw slide-in from the right
break;
}
}
}
updateTextState(timeStamp) {
switch (this.textState) {
case "ready":
this.startLeadIn(timeStamp); // Start lead-in transition
break;
case "leadIn":
this.handleLeadIn(timeStamp); // Handle lead-in transition
break;
case "pre-display":
this.handlePreDisplay(timeStamp); // Handle pre-display state
break;
case "display":
this.handleDisplay(timeStamp); // Handle display state
break;
case "post-display":
this.handlePostDisplay(timeStamp); // Handle post-display state
break;
case "leadOut":
this.handleLeadOut(timeStamp); // Handle lead-out transition
break;
}
}
startLeadIn(timeStamp) {
this.textState = "leadIn"; // Set state to lead-in
this.textStartTime = timeStamp; // Set start time
this.textElapsedTime = 0; // Reset elapsed time
this.trigger(this.textState); // Trigger state change event
}
handleLeadIn(timeStamp) {
this.textElapsedTime = timeStamp - this.textStartTime; // Calculate elapsed time
if (this.textElapsedTime >= this.textLeadInDuration) {
this.textState =
this.pixelsPerTime === 0 ? "display" : "pre-display"; // Move to next state
this.trigger(this.textState); // Trigger state change event
this.resetTime(timeStamp); // Reset time
}
}
handlePreDisplay(timeStamp) {
this.textElapsedTime = timeStamp - this.textStartTime; // Calculate elapsed time
if (this.textElapsedTime >= this.preDisplayDuration) {
this.textState = "display"; // Move to display state
this.resetTime(timeStamp); // Reset time
}
}
handleDisplay(timeStamp) {
this.textElapsedTime = timeStamp - this.textStartTime; // Calculate elapsed time
this.verticalScroll = this.pixelsPerTime * this.textElapsedTime; // Update vertical scroll amount
if (this.textElapsedTime >= this.textDisplayDuration) {
this.textState =
this.pixelsPerTime === 0 ? "leadOut" : "post-display"; // Move to next state
this.trigger(this.textState); // Trigger state change event
this.resetTime(timeStamp); // Reset time
}
}
handlePostDisplay(timeStamp) {
this.textElapsedTime = timeStamp - this.textStartTime; // Calculate elapsed time
if (this.textElapsedTime >= this.postDisplayDuration) {
this.textState = "leadOut"; // Move to lead-out state
this.trigger(this.textState); // Trigger state change event
this.resetTime(timeStamp); // Reset time
}
}
handleLeadOut(timeStamp) {
this.textElapsedTime = timeStamp - this.textStartTime; // Calculate elapsed time
if (this.textElapsedTime >= this.textLeadOutDuration) {
this.textState = "done"; // Set state to done
this.state = $GG.ui.hud.types.actions.state.Complete; // Set overall state to complete
this.trigger(this.textState); // Trigger state change event
this.resetTime(timeStamp); // Reset time
}
}
resetTime(timeStamp) {
this.textStartTime = timeStamp; // Set start time
this.textElapsedTime = 0; // Reset elapsed time
}
cacheValues(context) {
try {
const canvas = context.canvas;
const { width: containerWidth, height: containerHeight } = canvas;
const iconSize = containerHeight * 0.1; // Size of the icon
let fontSize = containerHeight * 0.1; // Main font size
let smallFontSize = containerHeight * 0.03; // Font size for author's name
const font = `${fontSize}px arial`; // Font for text
const nameFont = `${smallFontSize}px arial`; // Font for author's name
const borderWidth = 5; // Width of the border
const padding = 20; // Padding inside the text box
context.font = font;
const measure1 = context.measureText(this.text); // Measure text width
context.font = nameFont;
const measure2 = context.measureText(this.name); // Measure author's name width
const lineHeight =
measure1.fontBoundingBoxAscent +
measure1.fontBoundingBoxDescent +
this.lineSpacing; // Line height for text
const signatureHeight =
measure2.fontBoundingBoxAscent +
measure2.fontBoundingBoxDescent +
this.lineSpacing; // Height of the author's name
console.log("signatureHeight", measure1, measure2, signatureHeight);
const textWidth =
measure1.width > measure2.width
? measure1.width
: measure2.width; // Determine the width of the text box
const unwrapedFontBasedWidth = textWidth + padding * 2;
const maxBoxWidth = containerWidth * 0.8; // Maximum width of the text box
const maxBoxHeight = containerHeight * 0.8; // Maximum height of the text box
let textBoxWidth = Math.min(
maxBoxWidth * 0.8,
unwrapedFontBasedWidth + padding * 2
);
// Wrap text using the calculated font size
context.font = font;
let lines = this.wrapText(context, this.text, textBoxWidth - 39);
const maxLines = Math.floor(
(maxBoxHeight - padding * 2 - signatureHeight) /
lineHeight
); // Maximum number of lines that fit in the box
if (lines.length > maxLines) {
// Truncate lines to the maximum allowed
lines = lines.slice(0, maxLines);
// Add ellipsis to the last line
lines[maxLines - 1] = lines[maxLines - 1] + "...";
}
let textBoxHeight =
lineHeight * lines.length + padding * 2
+ signatureHeight;
// Calculate the position of the text box (centered)
const x = (containerWidth - textBoxWidth) / 2;
const y = (containerHeight - textBoxHeight) / 2;
const fontX =
this.textAlign == "center"
? containerWidth / 2
: x + borderWidth / 2 + padding;
const fontY = y + borderWidth / 2 + padding;
const nameX = x + textBoxWidth - borderWidth / 2 - padding;
const nameY =
y + textBoxHeight - borderWidth / 2 - padding - signatureHeight;
// Cache the calculated values
this.cachedValues = {
x,
y,
textBoxWidth,
textBoxHeight,
lineHeight,
lines,
font: font,
fontX: fontX,
fontY: fontY,
borderWidth,
nameX,
nameY,
nameFont,
iconSize,
};
} catch (error) {
console.error("Error caching values: ", error);
}
}
drawSlide(context, dir) {
const {
x,
y,
textBoxWidth,
textBoxHeight,
lineHeight,
lines,
font,
fontX: originalFontX,
fontY: originalFontY,
borderWidth,
nameX,
nameY,
nameFont,
iconSize,
} = this.cachedValues;
context.save(); // Save the current context state
let progress = this.getProgress(); // Get the progress of the transition
if (dir == "right") progress = progress * -1;
const containerWidth = context.canvas.width;
const offsetX = (containerWidth / 2) * progress; // Calculate the horizontal offset for the slide effect
context.translate(offsetX, 0); // Apply the translation for the slide effect
context.fillStyle = this.backgroundColor;
//context.fillRect(x, y, textBoxWidth, textBoxHeight);
// Begin path for rounded rectangle
context.beginPath();
const radius = 10; // Change this to whatever radius you want for the corners
context.moveTo(x + radius, y);
context.lineTo(x + textBoxWidth - radius, y);
context.arcTo(
x + textBoxWidth,
y,
x + textBoxWidth,
y + radius,
radius
);
context.lineTo(x + textBoxWidth, y + textBoxHeight - radius);
context.arcTo(
x + textBoxWidth,
y + textBoxHeight,
x + textBoxWidth - radius,
y + textBoxHeight,
radius
);
context.lineTo(x + radius, y + textBoxHeight);
context.arcTo(
x,
y + textBoxHeight,
x,
y + textBoxHeight - radius,
radius
);
context.lineTo(x, y + radius);
context.arcTo(x, y, x + radius, y, radius);
context.closePath();
// Fill the rounded rectangle
context.fill();
// Stroke the rounded rectangle
context.lineWidth = borderWidth;
context.strokeStyle = this.borderColor;
context.stroke();
if (this.iconImage != null) {
const ix = x - iconSize * 0.6;
const iy = y - iconSize * 0.6;
const iconWidth = iconSize;
const iconHeight = iconSize;
context.drawImage(
this.iconImage.ActiveStage,
ix,
iy,
iconWidth,
iconHeight
);
}
// Clip to the rounded rectangle
context.clip();
context.fillStyle = this.textColor;
context.font = font;
context.textAlign = "left";
context.textBaseline = "top";
const fontY = originalFontY - this.verticalScroll; // Adjust the vertical position for scrolling
lines.forEach((line, i) => {
context.fillText(line, originalFontX, fontY + i * lineHeight); // Draw each line of text
});
var playerName = this.name;
context.font = nameFont;
context.textAlign = "right";
context.textBaseline = "top";
context.fillText("~ " + playerName, nameX, nameY); // Draw the author's name
context.restore(); // Restore the context to its original state
}
getProgress() {
let progress = 0;
switch (this.textState) {
case "leadIn":
progress =
(1 - this.textElapsedTime / this.textLeadInDuration) * -1.5;
break;
case "pre-display":
progress = 0;
break;
case "display":
progress = 0;
break;
case "post-display":
progress = 0;
break;
case "leadOut":
progress =
(this.textElapsedTime / this.textLeadInDuration) * 1.5;
break;
default:
progress =
(this.textElapsedTime / this.textLeadInDuration) * 1.5;
break;
}
return progress; // Return the progress of the transition
}
wrapText(context, text, maxWidth) {
const words = text.split(" "); // Split the text into words
let lines = [];
let currentLine = words[0];
for (let i = 1; i < words.length; i++) {
const word = words[i];
const width = context.measureText(currentLine + " " + word).width;
if (width < maxWidth) {
currentLine += " " + word; // Add the word to the current line
} else {
lines.push(currentLine); // Start a new line
currentLine = word;
}
}
lines.push(currentLine); // Add the last line
return lines; // Return the array of lines
}
}
The PhotoBombHudAction class is designed to handle HUD actions, particularly those involving text display with various transitions. Below is a summary of its methods:
- Constructor: Initializes the HUD action with text, author's name, duration, transition type, and other properties.
- onTick: Called on each tick event to update the state of the text.
- onDraw: Called on each draw event to render the text and background on the canvas.
- updateTextState: Manages the state transitions for the text display.
- startLeadIn, handleLeadIn, handlePreDisplay, handleDisplay, handlePostDisplay, handleLeadOut: Handle the different states of the text display.
- resetTime: Resets the timing for state transitions.
- cacheValues: Caches the calculated values for rendering the text box.
- drawSlide: Draws the text box with a slide transition.
- getProgress: Calculates the progress of the current transition.
- wrapText: Wraps the text within the specified maximum width.
console/reaction.js
The reaction.js file defines a Reaction class used to animate visual reactions, like smileys or frowns, on the screen. These reactions appear, move across the screen, and then disappear. The class handles the setup, animation, and rendering of these reactions, ensuring they smoothly transition from one state to another. This functionality is used to display engaging, temporary visual feedback to users during events or interactions within the application.
export class Reaction extends $GG.ui.visual.type.props.base {
constructor(reaction, duration, properties = {}) {
super(properties, true);
this.reaction = reaction; // The type of reaction (e.g., smiley, frown)
this.duration = duration; // Duration of the reaction animation
this.startTimeStamp = null; // Timestamp when the animation starts
this.state = "pending"; // Initial state of the reaction
if (this.onInit) this.onInit(properties); // Initialize properties if onInit is defined
}
// Method called before initialization (can be used for custom setup)
onPreInit(properties) {}
// Method called during initialization
onInit(properties) {
this.image = $GG.ui.assets.getImage(this.reaction); // Get the image for the reaction
this.resize(this.image.Image.width, this.image.Image.height); // Resize based on the image dimensions
this.scale = 1; // Initial scale of the image
this.orientation.xScale = this.scale; // Set x scale
this.orientation.yScale = this.scale; // Set y scale
}
// Method called when the reaction is added to the stage
onStage() {
this.setSize(this._parent.width, this._parent.height); // Set the size based on the parent dimensions
// Start on the right with y at a random position from top to bottom
let minY = this._stage.width * 0.05;
let maxY = this._stage.height - minY;
let randomY = Math.floor(Math.random() * (maxY - minY)) + minY;
this.orientation.x = this._parent.width; // Set x position to the right edge
this.orientation.y = randomY; // Set y position to a random value
this.originalY = randomY; // Store the original y position
this.trigger("started"); // Trigger the started event
this.state = "active"; // Set state to active
}
// Method called when the stage is resized
onResize(w, h) {
this.setSize(w, h); // Adjust size based on new dimensions
}
// Clock tick event handler
onTick(timeStamp, deltaTime) {
if (this.state == "pending" || this.state == "completed") return; // Ignore if not active
if (this.startTimeStamp == null) this.startTimeStamp = timeStamp; // Set start timestamp if not set
const progress = this.progress(timeStamp); // Calculate progress of the animation
if (progress >= 1) {
this.state = "completed"; // Mark as completed
this.trigger("completed"); // Trigger the completed event
return;
}
this.orientation.x = this._parent.width - this._parent.width * progress; // Update x position based on progress
// Add a sinusoidal variation to the y position
let maxYVariance = this._parent.height * 0.05;
this.orientation.y =
this.originalY + maxYVariance * Math.sin(progress * 4 * Math.PI);
}
// Draw event handler
onDraw(context) {
if (this.state == "pending" || this.state == "completed") return; // Ignore if not active
const imageCanvas = this.image.ActiveStage; // Get the image for the active stage
context.drawImage(imageCanvas, 0, 0); // Draw the image on the canvas
}
// Calculate the progress of the animation
progress(timeStamp) {
return (timeStamp - this.startTimeStamp) / this.duration;
}
// Set the size and scale of the reaction based on the dimensions
setSize(w, h) {
this.scale = (w * 0.05) / this.width; // Calculate scale based on width
this.orientation.xScale = this.scale; // Set x scale
this.orientation.yScale = this.scale; // Set y scale
}
}
The Reaction class is designed to handle visual reactions in the HUD. Below is a summary of its methods:
- Constructor: Initializes the reaction with a specific type, duration, and additional properties.
- onPreInit: Method for pre-initialization tasks (customizable).
- onInit: Sets up the reaction image and initial properties.
- onStage: Called when the reaction is added to the stage, setting initial positions and sizes.
- onResize: Adjusts the size of the reaction when the stage is resized.
- onTick: Handles the animation progress on each tick.
- onDraw: Renders the reaction on the canvas.
- progress: Calculates the progress of the animation based on the elapsed time.
- setSize: Sets the size and scale of the reaction based on the dimensions of the stage.
console/assets.json
This is the asset file loaded by the console. (The // comments below are annotations for this documentation only; JSON does not support comments, so remove them in the actual file.)
{
"audio": [
{
"name": "camera",
"path": "assets/audio/camera.mp3" // Sound effect for camera click
},
{
"name": "pop",
"path": "assets/audio/pop.mp3" // Sound effect for pop
},
{
"name": "swoosh",
"path": "assets/audio/swoosh.mp3" // Sound effect for swoosh
},
{
"name": "happy-1",
"path": "assets/audio/happy-1.mp3" // Happy sound effect variation 1
},
{
"name": "happy-2",
"path": "assets/audio/happy-2.mp3" // Happy sound effect variation 2
},
{
"name": "happy-3",
"path": "assets/audio/happy-3.mp3" // Happy sound effect variation 3
},
{
"name": "happy-4",
"path": "assets/audio/happy-4.mp3" // Happy sound effect variation 4
},
{
"name": "funny-1",
"path": "assets/audio/funny-1.mp3" // Funny sound effect variation 1
},
{
"name": "funny-2",
"path": "assets/audio/funny-2.mp3" // Funny sound effect variation 2
},
{
"name": "angry-1",
"path": "assets/audio/angry-1.mp3" // Angry sound effect variation 1
},
{
"name": "love-1",
"path": "assets/audio/love-1.mp3" // Love sound effect variation 1
},
{
"name": "sad-1",
"path": "assets/audio/sad-1.mp3" // Sad sound effect variation 1
},
{
"name": "suprised-1",
"path": "assets/audio/suprised-1.mp3" // Surprised sound effect variation 1
}
],
"videos": [],
"images": [
{
"name": "message",
"path": "assets/images/message.png" // Image for message display
},
{
"name": "happy",
"path": "assets/images/happy.png" // Image for happy reaction
},
{
"name": "angry",
"path": "assets/images/angry.png" // Image for angry reaction
},
{
"name": "funny",
"path": "assets/images/funny.png" // Image for funny reaction
},
{
"name": "love",
"path": "assets/images/love.png" // Image for love reaction
},
{
"name": "sad",
"path": "assets/images/sad.png" // Image for sad reaction
},
{
"name": "suprised",
"path": "assets/images/suprised.png" // Image for surprised reaction
}
],
"layouts": [],
"data": []
}
The Controller
The provided code for the PhotoBomb controller sets up a user interface that allows players to interact with the game in various ways. The index.html file defines the layout of the interface, including sections for capturing images, sending messages, and reacting with different emotions. It includes buttons for each emotion, a message input area with character count display, and a hidden file input for image capture.
The controller.js file contains the JavaScript functionality for the controller. It initializes the interface, sets up event handlers for server connections and errors, and manages user interactions such as sending messages and capturing images. The script also handles broadcasting user reactions to the host and ensures a smooth user experience by playing audio feedback and providing real-time updates on the connection status.
Additionally, this code demonstrates the built-in globalization capabilities of the $GG framework. The domTranslate method and the inclusion of language settings in the HTML body tag show how the interface can dynamically adapt to different languages, making the application accessible to a broader audience. This setup enables users to actively participate in the game by sending messages, reacting to events, and sharing images, all while enjoying a localized and interactive experience in PhotoBomb.
controller/index.html
<!DOCTYPE html>
<html>
<head>
<title>PhotoBomb</title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link href="lib/toastr/2.1.4/toastr.min.css" rel="stylesheet" />
<link rel="stylesheet" href="css/site.css" asp-append-version="true" />
</head>
<!-- gg-gamekey is assigned in Gig Game HUD when you create an app to develop -->
<!-- gg-playerkey is something you create to test with. 01 = player 1, 02 = player 2, etc. -->
<body
gg-gamekey="{Key Assigned on App Creation}"
gg-playerkey="00000000-0000-0000-0000-000000000001"
gg-language="en">
<div class="controller">
<div class="header">
<div class="logo">
<img src="img/logo.png" />
</div>
<div class="camera">
<button onclick="Controller.camera();">
<div>
<img src="img/camera.png" />
</div>
</button>
</div>
</div>
<div class="emotions">
<div>
<div>
<button onclick="Controller.reaction('happy');">
<div>
<img src="img/happy.png" />
</div>
</button>
</div>
<div>
<button onclick="Controller.reaction('love');">
<div>
<img src="img/love.png" />
</div>
</button>
</div>
</div>
<div>
<div>
<button onclick="Controller.reaction('funny');">
<div>
<img src="img/funny.png" />
</div>
</button>
</div>
<div>
<button onclick="Controller.reaction('suprised');">
<div>
<img src="img/suprised.png" />
</div>
</button>
</div>
</div>
<div>
<div>
<button onclick="Controller.reaction('sad');">
<div>
<img src="img/sad.png" />
</div>
</button>
</div>
<div>
<button onclick="Controller.reaction('angry');">
<div>
<img src="img/angry.png" />
</div>
</button>
</div>
</div>
</div>
<div class="message">
<div class="control">
<img src="img/message.png" />
<label><span id="char-count">125</span> <span autotranslate>chars remaining</span></label>
<textarea id="message"></textarea>
</div>
<div class="actions">
<button type="button" class="btn btn-primary w-100 mt-3" onclick="Controller.send();" autotranslate>Send Message</button>
<button type="button" class="btn btn-secondary w-100 mt-3" onclick="Controller.clear();" autotranslate>Clear</button>
</div>
</div>
</div>
<input type="file" id="fileInput" style="display:none;" accept="image/*" capture="camera">
<div id="connection">
<div class="connection-message" autotranslate>
Please wait. Attempting network connection.
</div>
</div>
<script src="lib/bootstrap/5.1.0/js/bootstrap.bundle.min.js"></script>
<script src="lib/jquery/3.5.1/jquery.min.js"></script>
<script src="lib/toastr/2.1.4/toastr.min.js"></script>
<!-- GigGame Client Server Core Framework -->
<script src="https://launch.gig.game/api/js?key={ Your API Key }&libraries=globalization,ui"></script>
<!-- JS Game Controller Code -->
<script src="./controller.js"></script>
</body>
</html>
controller/controller.js
var Controller = {
connection: null, // Holds the connection status
lastReactionTime: 0, // Timestamp of the last reaction
init: function () {
this.domTranslate(); // Translate the DOM elements
this.initCharCount(); // Initialize character count for the message input
this.initializeCameraCapture(); // Initialize camera capture for uploading images
// Attach server event handlers
$GG.server.attachEvent("Error", this.errorHandler.bind(this));
$GG.server.attachEvent("OnConnect", this.connectHandler.bind(this));
$GG.server.attachEvent("OnConnectRestore", this.connectRestoreHandler.bind(this));
$GG.server.attachEvent("OnConnectLost", this.connectLostHandler.bind(this));
// Preload audio assets manually. I didn't bother with an asset sheet.
$GG.ui.assets.preloadAudio("pop", "audio/pop.mp3");
$GG.ui.assets.preloadAudio("camera", "audio/camera.mp3");
$GG.ui.assets.preloadAudio("error", "audio/error.mp3");
},
// Handler for successful connection
connectHandler() {
$("#connection").fadeOut(1000); // Fade out the connection status indicator
},
// Handler for restored connection
connectRestoreHandler() {
$("#connection").fadeOut(1000); // Fade out the connection status indicator
},
// Handler for lost connection
connectLostHandler() {
$("#connection").fadeIn(1000); // Fade in the connection status indicator
},
// Handler for errors
errorHandler(r) {
if (r.message) toastr.error(r.message); // Display error message
},
// Initialize character count for the message input field
initCharCount() {
const maxChars = 125;
const messageTextarea = $("#message");
const charCountDisplay = $("#char-count");
// Update character count on input
messageTextarea.on("input", function () {
let currentLength = $(this).val().length;
if (currentLength > maxChars) {
$(this).val($(this).val().substring(0, maxChars)); // Limit input to maxChars
currentLength = maxChars;
}
charCountDisplay.text(maxChars - currentLength); // Display remaining characters
});
},
// Initialize camera capture for uploading images
initializeCameraCapture() {
$("#fileInput").change(function () {
$GG.ui.audio.play("camera", false, 1); // Play camera sound
var file = this.files[0];
if (file) {
const reader = new FileReader();
reader.onloadend = function () {
const base64Data = reader.result.split(",")[1]; // Extract base64 data
// Upload the image to the server
$GG.server.uploadImage(base64Data)
.then(() => {
toastr.success("Image sent!"); // Display success message
})
.catch((error) => {
toastr.error("Error uploading the image"); // Display error message
});
};
reader.readAsDataURL(file); // Read the file as a data URL
}
});
},
// Trigger file input click for capturing image
camera() {
$("#fileInput").click();
},
// Handle reaction button click
reaction(name) {
const currentTime = Date.now();
const cooldownPeriod = 500; // 0.5 seconds in milliseconds
if (currentTime - this.lastReactionTime < cooldownPeriod) {
// Play error sound and display error message
$GG.ui.audio.play("error", false, 1);
toastr.error("You must wait a half second before reacting again.");
return;
}
// Update the last reaction time
this.lastReactionTime = currentTime;
// Broadcast reaction to host
$GG.server.broadcastToHost("SendReaction", { reaction: name });
// Play pop sound
$GG.ui.audio.play("pop", false, 1);
// Remove focus from all reaction buttons
const buttons = document.querySelectorAll(".emotions button");
buttons.forEach((button) => {
button.blur();
});
},
// Send the message to the host
send() {
var m = $("#message").val();
if (m == null || m.trim() == "") {
toastr.warning("Please enter a message first."); // Display warning if message is empty
return;
}
// Broadcast message to host
$GG.server.broadcastToHost("SendMessage", { message: m });
$("#message").val(""); // Clear the message input
},
// Clear the message input
clear() {
$("#message").val("");
},
// Translate the DOM elements
domTranslate: function () {
this.translateHandle = $GG.globalization.language.translateDom();
},
};
// Initialize the controller when the document is ready
$(document).ready(function () {
$GG.server.attachEvent("OnReady", () => {
Controller.init();
});
});