WWDC25 Metal & game technologies group lab

Hello,

Thank you for attending today’s Metal & game technologies group lab at WWDC25!

We were delighted to answer many questions from developers and energized by the community engagement. We hope you enjoyed it and welcome your feedback.

We invite you to carry on the conversation here, particularly if your question appeared in Slido and we were unable to answer it during the lab.

If your question received an answer, let us know if you need clarification.

You may want to ask your question again in a different lab, e.g. the visionOS lab tomorrow. (We realize this can be confusing when frameworks interoperate.)

We have a lot to learn from each other so let’s get to Q&A and make the best of WWDC25! 😃

Looking forward to your questions posted in new threads.

Getting into visionOS and Metal for creating games: are there any tips, or optimal APIs, frameworks, or tools, that you would recommend?

The easiest way to start on visionOS is with RealityKit. The GameController framework supports a wide range of controllers for different types of games, and up to 200 GB of Background Assets are available for content-rich games.

Are there standard methods of achieving synchronization of state between players (specifically on visionOS here), or is it a roll-your-own situation?

For synchronizing objects in a Shared spatial environment, consider using TableTopKit, which is compatible with GKMatch in the GameKit framework. If you’re using RealityKit, you can use SynchronizationService to coordinate shared physically simulated objects. We also recommend the WWDC session: Share visionOS experiences with nearby people.

Which official Metal docs or sample projects best support teams moving a CPU-centric CAD kernel to GPU-accelerated workflows on Apple Silicon?

The Performing Calculations on a GPU sample code shows how to use Metal to identify available GPUs and run calculations on them. The Processing a Texture in a Compute Function sample code demonstrates creating textures by running copy and dispatch commands within a compute pass on the GPU.
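The shape of the first sample above can be condensed into a minimal sketch: compile a kernel, bind buffers, dispatch, and read back. This is an illustrative stand-in, not the Apple sample itself; it assumes an Apple Silicon GPU (for `dispatchThreads` with non-uniform threadgroups).

```swift
import Metal

// Minimal GPU compute sketch: add two float arrays element-wise.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void add_arrays(device const float *a [[buffer(0)]],
                       device const float *b [[buffer(1)]],
                       device float *out [[buffer(2)]],
                       uint i [[thread_position_in_grid]]) {
    out[i] = a[i] + b[i];
}
"""

func gpuAdd(_ a: [Float], _ b: [Float]) -> [Float]? {
    guard let device = MTLCreateSystemDefaultDevice(),
          let queue = device.makeCommandQueue(),
          let library = try? device.makeLibrary(source: source, options: nil),
          let function = library.makeFunction(name: "add_arrays"),
          let pipeline = try? device.makeComputePipelineState(function: function)
    else { return nil }

    let n = a.count
    let byteCount = n * MemoryLayout<Float>.stride
    guard let bufA = device.makeBuffer(bytes: a, length: byteCount),
          let bufB = device.makeBuffer(bytes: b, length: byteCount),
          let bufOut = device.makeBuffer(length: byteCount),
          let cmd = queue.makeCommandBuffer(),
          let enc = cmd.makeComputeCommandEncoder()
    else { return nil }

    enc.setComputePipelineState(pipeline)
    enc.setBuffer(bufA, offset: 0, index: 0)
    enc.setBuffer(bufB, offset: 0, index: 1)
    enc.setBuffer(bufOut, offset: 0, index: 2)
    // One thread per element; clamp the threadgroup width to the pipeline limit.
    let width = min(pipeline.maxTotalThreadsPerThreadgroup, n)
    enc.dispatchThreads(MTLSize(width: n, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: width, height: 1, depth: 1))
    enc.endEncoding()
    cmd.commit()
    cmd.waitUntilCompleted()

    let ptr = bufOut.contents().bindMemory(to: Float.self, capacity: n)
    return Array(UnsafeBufferPointer(start: ptr, count: n))
}
```

The same bind-dispatch-readback pattern scales from this toy kernel up to CAD-style workloads; the main difference there is keeping data resident on the GPU across dispatches rather than reading back every result.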

Is there a way to automatically extract Metal performance metrics (e.g. to a CSV file) from a GPU trace for further analysis outside of the metal debugger?

You can capture a GPU trace and replay it with profiling enabled. In the Performance view, choose Xcode → Editor → Export GPU Counters to export the performance metrics as a CSV file.

How can Game Center be used with SwiftData apps to determine when someone has earned an award (e.g. a task-tracking app that wants to give out awards for completing tasks)?

GameKit provides APIs for querying the player’s earned achievements and leaderboards. You can read from those APIs and write to your own SwiftData or other persistence system to trigger rewards in your game. Generally speaking, Game Center does not directly interact with SwiftData so it’s easiest to use GameKit APIs to post scores and achievement progress in parallel with your own database or backend services.
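The "post in parallel with your own database" approach above can be sketched as follows. The achievement identifier `tasks_completed_10` and the progress function are assumptions for illustration; identifiers are configured in App Store Connect.

```swift
import GameKit

// Hypothetical sketch: after your SwiftData layer records a completed task,
// report progress to Game Center in parallel with your own persistence.
// "tasks_completed_10" is an assumed achievement identifier.
func reportTaskProgress(completed: Int, goal: Int) {
    guard GKLocalPlayer.local.isAuthenticated else { return }
    let achievement = GKAchievement(identifier: "tasks_completed_10")
    achievement.percentComplete = min(100.0, Double(completed) / Double(goal) * 100.0)
    achievement.showsCompletionBanner = true
    GKAchievement.report([achievement]) { error in
        if let error {
            // Game Center reporting is best-effort; your own database
            // remains the source of truth for in-app rewards.
            print("Game Center report failed: \(error)")
        }
    }
}
```

Because Game Center accumulates `percentComplete` server-side, it is safe to re-report the same or lower progress; only increases take effect.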

Hey, Thank you so much for your work and this session! I have a quick question regarding the Games app. When we publish new games on the App Store, are they automatically added to the Games app as well? Is there any special integration required for this process?

If the app is set to the Games category, it will show up in the Games app automatically without extra work. If you adopt Game Center, it will also appear in the games library. Most importantly, the more Game Center features the game adopts, the more the OS knows about it and the more places it can be featured in the Games app. Adopting Game Center also makes your game eligible for the Top Played Games chart, which ranks games by play time, measured starting from when your game initializes Game Center.

Is there a way to do a local p2p network based game? I am trying to find a way to create a coop party game, which doesn’t require people to be connected to internet.

For fully offline play, use Multipeer Connectivity or the Network framework; both let you create a local P2P network-based game without requiring an Internet connection. For local peer-to-peer games, Game Center's Nearby Multiplayer also offers useful features, but it requires an Internet connection.
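A minimal Multipeer Connectivity sketch of the offline route, assuming a made-up service type `my-partygame` (it must match on all peers and be registered in your Info.plist Bonjour services):

```swift
import MultipeerConnectivity

// Hypothetical sketch of a local, Internet-free P2P party session.
final class PartySession: NSObject, MCSessionDelegate, MCNearbyServiceAdvertiserDelegate {
    let peerID = MCPeerID(displayName: "Player")
    lazy var session = MCSession(peer: peerID, securityIdentity: nil,
                                 encryptionPreference: .required)
    lazy var advertiser = MCNearbyServiceAdvertiser(peer: peerID, discoveryInfo: nil,
                                                    serviceType: "my-partygame")

    func start() {
        session.delegate = self
        advertiser.delegate = self
        advertiser.startAdvertisingPeer() // discovered over peer-to-peer Wi-Fi/Bluetooth
    }

    // Auto-accept invitations; fine for a trusted local party game.
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser,
                    didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?,
                    invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)
    }

    // Required MCSessionDelegate methods (no-op here; real games handle these).
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didReceiveResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}
```

A second device would use `MCNearbyServiceBrowser` with the same service type to find and invite this peer; game state is then exchanged with `session.send(_:toPeers:with:)`.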

How can development for visionOS benefit from Metal 4?

You can develop in Metal, as there is template code available for it. Metal 4 can help reduce the CPU overhead of encoding rendering, and visionOS supports presenting from Metal 4 command queues.

The new Metal 4 API looks like it opens some more possibilities for threaded encoding. I also noticed there wasn’t much mention of GPU driven encoding this year. Is GPU driven encoding or CPU driven encoding the best starting point for building a performant Metal app?

Metal 3 and 4 support the same GPU-driven rendering features (indirect command buffers). Concurrent CPU encoding is typically a good starting point for improving performance through parallelism, and three threads is a good match for a standard Metal drawable pool. Metal 4 makes concurrent CPU encoding easier through explicit scheduling.
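Concurrent CPU encoding as described above can be sketched with Metal 3's parallel render encoder, where one render pass is split across worker threads. This is an illustrative sketch, not the Metal 4 explicit-scheduling path; the encoding closures are placeholders.

```swift
import Metal

// Hypothetical sketch: split one render pass across three CPU threads.
func encodeInParallel(queue: MTLCommandQueue, descriptor: MTLRenderPassDescriptor) {
    guard let cmd = queue.makeCommandBuffer(),
          let parallel = cmd.makeParallelRenderCommandEncoder(descriptor: descriptor)
    else { return }

    // Sub-encoders are created up front on the calling thread; their creation
    // order defines GPU execution order. Each one may then encode on its own thread.
    let encoders = (0..<3).compactMap { _ in parallel.makeRenderCommandEncoder() }
    DispatchQueue.concurrentPerform(iterations: encoders.count) { i in
        let enc = encoders[i]
        // ... encode this thread's share of the draw calls ...
        enc.endEncoding()
    }

    parallel.endEncoding()
    cmd.commit()
}
```

Three worker threads here mirrors the three-drawable pool mentioned above: while one frame is encoded, the previous ones can be in flight on the GPU and on screen.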

What scheduling pattern is advised to run long compute passes (e.g. topology optimization) alongside interactive rendering in one Metal app without hurting UI latency?

Use a separate command queue for issuing long-running compute dispatches so that interactive rendering can run concurrently, and use Metal events to synchronize data dependencies between the command queues.
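A minimal sketch of that two-queue pattern, with an `MTLEvent` ordering only the render work that actually consumes the compute results (the encoding bodies are placeholders):

```swift
import Metal

// Hypothetical sketch: long-running compute on its own queue, with an event
// so interactive rendering waits only where it reads the compute output.
func scheduleWork(device: MTLDevice) {
    guard let computeQueue = device.makeCommandQueue(),
          let renderQueue = device.makeCommandQueue(),
          let event = device.makeEvent() else { return }

    // Long-running compute (e.g. topology optimization) on a dedicated queue.
    if let cmd = computeQueue.makeCommandBuffer() {
        // ... encode compute dispatches here ...
        cmd.encodeSignalEvent(event, value: 1) // signaled when the pass completes
        cmd.commit()
    }

    // Interactive rendering: only the command buffer that consumes the
    // results waits on the event; other frames proceed unblocked.
    if let cmd = renderQueue.makeCommandBuffer() {
        cmd.encodeWaitForEvent(event, value: 1)
        // ... encode the render pass that reads the compute output ...
        cmd.commit()
    }
}
```

Splitting a very long dispatch into smaller chunks, each signaling an increasing event value, further reduces the risk of one monolithic compute pass starving the render queue.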

How do you think about the Games app compared to the previously removed Game Center app? What aspects and integrations do you think will keep players coming back in a way they didn't for the original app despite it also having challenges? Should we be pushing people to use the app from our games?

Game Center itself has had a series of improvements for developers and players: activity feed, widgets, integrations into the App Store, improvements to friending and the social graph, multiplayer matchmaking, and more. The new Games app brings all of these Game Center capabilities together in one place that players can easily find. It also offers deeper game discovery through search and personalized recommendations. From within your games, you can guide users into the Games app if you'd like; you also have the option to bring them into the Game Overlay so they don't need to leave your game experience. The more you integrate features like Challenges, the more prominently your game will appear within the Games app, which can help keep players engaged in ways the original Game Center app did not.

Is there a way to configure challenges for all my existing leaderboards in bulk? Looks like this will be something most game developers will want to do. Especially, have a challenge for each level in the game.

The new GameKit bundle in Xcode is recommended, as it provides UI for configuring these elements in Xcode. Under the hood, the GameKit bundle is a JSON file, which can be easily automated with a script. You can also check out the WWDC session: Get started with Game Center for best practices in configuring challenges.

(Continued)

Are there any actual game sample projects available to download?

See the Metal Sample Code Library.
Try Bring your advanced games to Mac, iPad, and iPhone. This tutorial covers essential technologies for implementing games on Apple platforms, including Metal rendering, audio, input, and display APIs.
See Port advanced games to Apple platforms.
See Design advanced games for Apple platforms.

Inspired by Liquid Glass, I want to add shader effects to my app in SwiftUI, but I'm concerned about performance. How do Metal shaders fit in to the SwiftUI layout/drawing system, and are there any performance pitfalls to watch out for?

SwiftUI invokes your shaders during rendering. We recommend the WWDC session Optimize SwiftUI performance with Instruments to understand the performance impact of Metal shaders executed during a SwiftUI rendering pass.
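A minimal sketch of how a Metal shader plugs into SwiftUI, assuming a hypothetical `tint` function compiled into the app's default Metal shader library (requires iOS 17/macOS 14):

```swift
import SwiftUI

// Assumed shader in a .metal file in the app target:
//
//   [[ stitchable ]] half4 tint(float2 position, half4 color, float strength) {
//       return half4(color.rgb * half3(1.0h, half(1.0 - strength), half(1.0 - strength)),
//                    color.a);
//   }
struct ShadedLabel: View {
    var body: some View {
        Text("Liquid")
            .font(.largeTitle)
            // The shader runs per pixel when SwiftUI renders this view.
            // Keep it cheap, and prefer applying it to small subtrees rather
            // than large view hierarchies, since every covered pixel pays.
            .colorEffect(ShaderLibrary.tint(.float(0.5)))
    }
}
```

`colorEffect` is the cheapest of SwiftUI's shader modifiers (per-pixel color only); `distortionEffect` and `layerEffect` give more power at higher cost, which is where the Instruments template mentioned above becomes most useful.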

A newbie to the Apple developer experience here: where can I start if I want to start learning to create games? (I understand it will take time.)

A good place to get started is the Game Porting Toolkit. This resource covers essential technologies for implementing games on Apple platforms, including Metal rendering, audio, input, and display APIs.

I'd like to ask: for streaming very large assembly meshes, is it preferable to rely on argument buffers with sparse heaps, compute-driven culling, or mesh shaders to balance memory footprint against orbit/zoom fluidity?

Sparse heaps are great for reducing the memory footprint of streaming vertex buffer data and can be combined with culling techniques. If the update frequency is every frame, culling with mesh shaders is excellent for reducing the amount of memory management. Compute-driven culling has broader support, especially on older hardware.
