I'm trying to display an overlay on the screen with the following code:
NSRect windowRect = [[NSScreen mainScreen] frame];
self.overlayWindow = [[NSWindow alloc] initWithContentRect:windowRect
                                                 styleMask:NSWindowStyleMaskBorderless
                                                   backing:NSBackingStoreBuffered
                                                     defer:NO
                                                    screen:[NSScreen mainScreen]];
// NO: the strong property reference already owns the window.
[self.overlayWindow setReleasedWhenClosed:NO];
[self.overlayWindow setBackgroundColor:[NSColor colorWithCalibratedRed:0.0
                                                                 green:1.0
                                                                  blue:0.0
                                                                 alpha:0.1]];
[self.overlayWindow setAlphaValue:1.0];
[self.overlayWindow setOpaque:NO];
// Click-through overlay: let mouse events pass to windows below.
[self.overlayWindow setIgnoresMouseEvents:YES];
[self.overlayWindow makeKeyAndOrderFront:nil];
self.overlayWindow.level = NSScreenSaverWindowLevel;
self.overlayWindow.collectionBehavior = NSWindowCollectionBehaviorCanJoinAllSpaces | NSWindowCollectionBehaviorCanJoinAllApplications;
But when another app enters full screen, the overlay disappears, even though I set the collectionBehavior with the NSWindowCollectionBehaviorCanJoinAllApplications option. Is it possible to display an overlay on top of all other apps?
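For future readers, one likely missing piece (an assumption, not verified against this exact setup) is NSWindowCollectionBehaviorFullScreenAuxiliary, which is what allows a window to join another app's full-screen space. A minimal Swift sketch:

import AppKit

let overlay = NSWindow(contentRect: NSScreen.main?.frame ?? .zero,
                       styleMask: .borderless,
                       backing: .buffered,
                       defer: false)
overlay.isOpaque = false
overlay.backgroundColor = NSColor.green.withAlphaComponent(0.1)
overlay.ignoresMouseEvents = true
overlay.level = .screenSaver
// .fullScreenAuxiliary lets the window appear over full-screen spaces.
overlay.collectionBehavior = [.canJoinAllSpaces, .fullScreenAuxiliary]
overlay.orderFrontRegardless()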
Our app has previously not supported dark mode and we had the "Appearance" entry in our Info.plist set to "Light".
We are now about to release an update that enables dark mode support. To enable this we have:
Added a preference to our app's settings screen that lets users choose between System, Light and Dark options.
Based on the user's preference, we set the entire app's preferred color scheme using the SwiftUI .preferredColorScheme modifier on our root view (see the sketch after this list).
Removed the "Appearance" entry from our Info.plist
This is all tested and working in our local development builds. We are now testing out the app for release using an internal TestFlight build and we've run into a problem - after initially updating the app, it does not seem to detect the change to the Info.plist and the app remains in light mode even if you change the preferred colour scheme.
If you force quit the app from the app switcher and re-launch it, the colour scheme preference starts working as expected.
This is going to be an issue for our users because when they update the app it is going to look like the new color scheme setting does not work. Having to ask customers to force quit the app from the app switcher is not really an acceptable workaround.
I'm not sure this is specifically tied to the app process being killed, because I would expect that to happen anyway when the app is updated. I'm wondering if this is related to the system caching the UISceneSession for the app, and whether force killing it from the app switcher is what causes the cached session to be recreated.
Is this a known issue and is there any way to solve this?
The social engineering is also driving me mad, as is the foreign direct interference.
I was trying to submit a student loan application, and the system started presenting illogical constraints based on the page design, which previously wasn't an issue. Now it is a problem, and if I don't complete the page-design workflows, the funds are not released to the university advertising the master's degree services.
There also seems to be a variety of universities simultaneously competing for the same student loan accessibility, and it's quite difficult to decide which one to go with, or which master's degree course to take. That is less of a problem than the UI/UX bug of not being able to complete or submit the entire funding application, which feels more like malware targeting than actual usability practice.
What details must I put into the funding website and the university's site, and what about privacy?
I tried the ScreenCaptureKit sample code from Apple:
ScreenCaptureKit Sample Code
When I ran it for a while, it crashed at a strange position, as shown in the attached screenshot.
The value array is not empty and has a value at index 0, but it still crashed.
I'm trying to add an SVG image to my launch screen. The SVG image works fine in the main storyboard, where it is also used in a UIImageView, but the launch screen remains completely black. The launch screen is set with a white background, so it seems to be completely ignored.
When I remove the image from the UIImageView, the launch screen is shown with the correct background color, but of course without the wished-for image. I can also correctly display text in the launch screen; it shows the text and the background color correctly. As soon as I assign an image from the asset catalogue to the UIImageView in the launch screen, the launch screen turns completely black, not showing anything. I also tried a simple PNG image set instead of the SVG image, but the issue remains. How can I use an SVG image in my launch screen?
Demo code:
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Flip the coordinate system.
    CGContextSetTextMatrix(context, CGAffineTransformIdentity);
    CGContextTranslateCTM(context, 0, self.bounds.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);

    NSDictionary *attrs = @{NSFontAttributeName: [UIFont systemFontOfSize:20],
                            NSForegroundColorAttributeName: [UIColor blueColor],
                            NSUnderlineStyleAttributeName: @(NSUnderlineStyleThick)};

    // Make an attributed string.
    NSAttributedString *attributedString = [[NSAttributedString alloc] initWithString:@"Hello CoreText!" attributes:attrs];
    CFAttributedStringRef attributedStringRef = (__bridge CFAttributedStringRef)attributedString;

    // Simple Core Text with CTFrameDraw.
    CTFramesetterRef framesetter = CTFramesetterCreateWithAttributedString(attributedStringRef);
    CGPathRef path = CGPathCreateWithRect(self.bounds, NULL);
    CTFrameRef frame = CTFramesetterCreateFrame(framesetter, CFRangeMake(0, 0), path, NULL);
    //CTFrameDraw(frame, context);

    // You can comment out CTFrameDraw and draw with CTLineDraw instead.
    CFArrayRef lines = CTFrameGetLines(frame);
    CGPoint lineOrigins[CFArrayGetCount(lines)];
    CTFrameGetLineOrigins(frame, CFRangeMake(0, 0), lineOrigins);
    for (int i = 0; i < CFArrayGetCount(lines); i++) {
        CTLineRef line = CFArrayGetValueAtIndex(lines, i);
        CGContextSetTextPosition(context, lineOrigins[i].x, lineOrigins[i].y);
        // CTLineDraw(line, context);

        // You can comment out CTLineDraw and draw with CTRunDraw instead.
        // Note: CTRunDraw drops some attributes, such as NSUnderlineStyleAttributeName,
        // so you need to draw those yourself.
        CFArrayRef runs = CTLineGetGlyphRuns(line);
        for (int j = 0; j < CFArrayGetCount(runs); j++) {
            CTRunRef run = CFArrayGetValueAtIndex(runs, j);
            CTRunDraw(run, context, CFRangeMake(0, 0));
        }
    }

    // Release the Core Foundation objects created above.
    CFRelease(frame);
    CGPathRelease(path);
    CFRelease(framesetter);
}
This code uses CTRunDraw to draw the content. The underline is drawn and shown normally on iOS 17 with Xcode 15, but when built with Xcode 16 against the iOS 18 beta, the underline is missing.
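For anyone experimenting with a workaround: since CTRunDraw skips the underline, it can be stroked manually per run. A rough Swift sketch (the metrics are deliberately simplified; a real implementation should use the font's underlinePosition and underlineThickness):

import UIKit
import CoreText

// Sketch: draw a run, then stroke its underline manually if the run
// carries an underline attribute.
func drawRun(_ run: CTRun, of line: CTLine, lineOrigin: CGPoint, in context: CGContext) {
    CTRunDraw(run, context, CFRange(location: 0, length: 0))

    let attributes = (CTRunGetAttributes(run) as NSDictionary) as? [NSAttributedString.Key: Any] ?? [:]
    guard let style = attributes[.underlineStyle] as? Int, style != 0 else { return }

    // Measure the run and find its horizontal offset within the line.
    var ascent: CGFloat = 0, descent: CGFloat = 0, leading: CGFloat = 0
    let width = CGFloat(CTRunGetTypographicBounds(run, CFRange(location: 0, length: 0),
                                                  &ascent, &descent, &leading))
    let offset = CTLineGetOffsetForStringIndex(line, CTRunGetStringRange(run).location, nil)

    // Crude underline position just below the baseline.
    let y = lineOrigin.y - 2
    context.move(to: CGPoint(x: lineOrigin.x + offset, y: y))
    context.addLine(to: CGPoint(x: lineOrigin.x + offset + width, y: y))
    context.setLineWidth(1)
    context.strokePath()
}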
I want to convert CGPoint into SCNVector3. I am using ARFaceTrackingConfiguration for face tracking.
Below is my code to convert SCNVector3 to CGPoint
let point = faceAnchor.verticeAndProjection(to: sceneView, facePoint: faceAnchor.geometry.vertices[0])
print(point, faceAnchor.geometry.vertices[0])
which prints the values below:
CGPoint = (350.564453125, 643.4456787109375)
SIMD3<Float>(0.014480735, 0.01397189, 0.04508282)
extension ARFaceAnchor {
    // Struct to store the 3D vertex and the 2D projection point.
    struct VerticesAndProjection {
        var vertex: SIMD3<Float>
        var projected: CGPoint
    }

    // Project a face vertex into the view's 2D coordinates.
    func verticeAndProjection(to view: ARSCNView, facePoint: Int) -> CGPoint {
        let point = SCNVector3(geometry.vertices[facePoint])
        let col = SIMD4<Float>(SCNVector4())
        let pos = SIMD4<Float>(SCNVector4(point.x, point.y, point.z, 1))
        let pworld = transform * simd_float4x4(col, col, col, pos)
        let vect = view.projectPoint(SCNVector3(pworld.position.x, pworld.position.y, pworld.position.z))
        return CGPoint(x: CGFloat(vect.x), y: CGFloat(vect.y))
    }
}

extension matrix_float4x4 {
    /// Get the position (translation) of the transform matrix.
    public var position: SCNVector3 {
        SCNVector3(self[3][0], self[3][1], self[3][2])
    }
}
Now I want to convert the same CGPoint back to an SCNVector3.
I tried the code below, but it does not give the expected value, which is SIMD3<Float>(0.014480735, 0.01397189, 0.04508282):
let projectedOrigin = sceneView.projectPoint(SCNVector3Zero)
let unproject = sceneView.unprojectPoint(SCNVector3(point.x, point.y, CGFloat(projectedOrigin.z)))
let vector = SCNVector3(unproject.x, unproject.y, unproject.z)
Is there any way to convert a CGPoint to an SCNVector3? I cannot use hitTest because this CGPoint does not lie on a node; it is somewhere on the face area.
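One direction that might be worth trying (a sketch, not verified for this setup): unprojectPoint expects the depth of the point that was originally projected, so reusing the z from projecting the world-space face point, and then mapping back through the inverse of the anchor transform, could recover the anchor-local vertex:

// Sketch: assumes sceneView, faceAnchor, and the world-space matrix
// pworld from the extension above are in scope.
let projected = sceneView.projectPoint(SCNVector3(pworld.position.x,
                                                  pworld.position.y,
                                                  pworld.position.z))
// Unproject the screen point using the depth of the original projection.
let world = sceneView.unprojectPoint(projected)
// Map the world point back into the face anchor's local space.
let local = simd_mul(simd_inverse(faceAnchor.transform),
                     SIMD4<Float>(world.x, world.y, world.z, 1))
print(SIMD3<Float>(local.x, local.y, local.z))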
Good news, bad news.
Good news: I have built my first App Clip! After our app submission was accepted, it gave us a working "default App Clip URL", which successfully launches our App Clip card and App Clip.
Bad news: all this work was done to associate our App Clip link with our website so we could have a very clean URL, but that URL is not launching our App Clip card or the Clip. Everything points to it looking good:
Diagnostics in Apple Developer settings are all green checkmarks: associated domains, App Clip published on the App Store, smart app banner.
My associated domain URL is "validated" on App Store Connect.
My website has a smart app banner meta tag with the bundle identifier, and an Open Graph photo configured.
My App Clip has the domain in its entitlements file.
What I'm expecting: sending a text with the website's URL should show my App Clip card and open the App Clip instead of my website.
I shouldn't need to configure an advanced App Clip experience because it's just via Messages, right? According to the documentation, advanced experiences are for maps, QR codes, etc., right?
From what I can tell, everything is set up correctly, so why, when I send myself a text message with the website's URL, does the App Clip card not pop up?
Hello, my App Clip is showing as unavailable in the App Clip card. Until earlier this afternoon, my App Clip experience was showing correctly with an Open action. All of a sudden it started to fail. I did not submit a new version or update the association file.
To reproduce, scan qr.netflix.com/C/123
The SF Symbols app 6.0 (99) does not export the ‘Can Rotate’ property of layers when exporting a symbol via File > Export Symbol.
Without this, all the fantastic new editing functions related to rotation in the SF Symbols app are effectively useless.
This issue with the SF Symbols 6 app can be reproduced by exporting a rotatable symbol like fan.desk and then importing the result as a custom symbol. When inspecting 'Group 1' of the imported symbol, it is no longer marked as rotatable.
SF Symbols app 6.0 is still in beta, but hasn't been updated since 10 June. Hopefully this bug will be solved in the release version, or earlier.
Does anyone know how to manually add the missing rotation info to the exported SVG file?
In case an Apple engineer reads this: FB13916635
I have an URGENT assignment, with details as below:
I have an application, and it has an extension UI.
When I select an item in a popup list in the standard UI of the application, how can I catch that selection event in the standard UI and update the extension UI?
Is there any communication channel between the standard UI and the extension UI on macOS?
Please share your experience with this technique.
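One possible channel (an assumption; the right mechanism depends on the extension type) is a distributed notification from the host app to the extension. A minimal Swift sketch, with an illustrative notification name:

import Foundation

let selectionChanged = Notification.Name("com.example.selectionChanged") // illustrative name

// In the main app, after the popup selection changes:
DistributedNotificationCenter.default().post(name: selectionChanged, object: nil)

// In the extension, observe and refresh the UI:
let observer = DistributedNotificationCenter.default().addObserver(forName: selectionChanged,
                                                                   object: nil,
                                                                   queue: .main) { _ in
    // Reload the extension UI here.
}
// Keep `observer` alive for as long as the extension UI exists.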
In our app, we let the user type in credit card numbers, and we want to enable the user to autofill credit card numbers saved in Safari.
We use a web view to open a link that leads to a PCI-compliant third-party component.
Specifically, we use WKWebView. If we used SFSafariViewController, the link would open using real Safari and the user would be able to autofill from Safari, but we use WKWebView.
My question is:
Is it possible to enable the user to autofill credit card numbers saved in Safari using WKWebView?
Thanks!
Has anyone been able to create a Control Center widget that opens a snippet view? There are stock Control Center widgets that do this, but I haven't been able to get it to work.
Here's what I tried:
import AppIntents
import SwiftUI
import WidgetKit

struct SnippetButton: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(
            kind: "xxx.xxx.snippetWidget"
        ) {
            ControlWidgetButton(action: SnippetIntent()) {
                Label("Show Snippet", systemImage: "map.fill")
            }
        }
        .displayName(LocalizedStringResource("Show Snippet"))
        .description("Show a snippet.")
    }
}

struct SnippetIntent: ControlConfigurationIntent {
    static var title: LocalizedStringResource = "Show a snippet"
    static var description = IntentDescription("Show a snippet with some text.")

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        return .result(dialog: IntentDialog("Hello!"), view: SnippetView())
    }
}

struct SnippetView: View {
    var body: some View {
        Text("Hello!")
    }
}
Apple's documentation pretty much only says this about ObservableObject: "A type of object with a publisher that emits before the object has changed. By default an ObservableObject synthesizes an objectWillChange publisher that emits the changed value before any of its @Published properties changes.".
And this sample seems to behave the same way whether or not Contact conforms to the protocol:
import UIKit
import Combine

class ViewController: UIViewController {
    let john = Contact(name: "John Appleseed", age: 24)
    private var cancellables: Set<AnyCancellable> = []

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        john.$age.sink { age in
            print("View controller's john's age is now \(age)")
        }
        .store(in: &cancellables)
        print(john.haveBirthday())
    }
}

class Contact {
    @Published var name: String
    @Published var age: Int

    init(name: String, age: Int) {
        self.name = name
        self.age = age
    }

    func haveBirthday() -> Int {
        age += 1
        return age
    }
}
Can I therefore omit conformance to ObservableObject every time I don't need the objectWillChange publisher?
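(For comparison, the only thing conformance adds in a sample like this is the synthesized objectWillChange publisher — a minimal sketch:)

import Combine

// Sketch: with ObservableObject conformance, objectWillChange fires
// before any @Published property changes.
final class ObservableContact: ObservableObject {
    @Published var age = 24
}

let contact = ObservableContact()
let subscription = contact.objectWillChange.sink { _ in
    print("contact is about to change")
}
contact.age += 1 // prints "contact is about to change" before the value updates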
I went to update to the Apple Intelligence beta in macOS Sequoia 15.1 yesterday when it dropped, and I'm still waitlisted. The weird thing is, I had it on my Mac within a few minutes of downloading the beta, but then I restarted the Mac and it now says "joined waitlist", and it still does today.
My app needs to recognize screen gestures the hard way. I have working code for flick recognition (a combination of distance, direction, and velocity), but it's not as reliable as whatever Apple uses. Does anyone know exactly what defines a flick in iOS?
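For comparison, here is the shape of a common heuristic (a sketch; the threshold is a guess, not Apple's definition):

import UIKit

// Sketch: treat a pan that ends above a velocity threshold as a flick.
final class FlickHandler {
    // Points per second; illustrative value, tune empirically.
    static let flickSpeed: CGFloat = 1000

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        guard gesture.state == .ended else { return }
        let v = gesture.velocity(in: gesture.view)
        if hypot(v.x, v.y) > Self.flickSpeed {
            // Treat as a flick in the direction of `v`.
        }
    }
}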
I have been using UIDocumentInteractionController and presentOptionsMenuFromRect for sharing PDFs for years.
Now I get these errors.
Only support loading options for CKShare and SWY types.
[ERROR] failed to get service endpoint creating for for item at URL
Collaboration: error loading metadata for documentURL:file:
I believe the files I create are fine. I have tried sharing them from a temporary folder and also from a subfolder on the iPad, but I still get these errors. Any help appreciated.
I have apps that request a route between two locations, then search, filter, and display facilities near the route. The apps first request the route on a background thread, then, based on the route, search for facilities near certain locations on or near it. There may be multiple searches on the same route, each for a different location. Suitable results are then displayed on the map. The apps also do live updates.
However, since I switched to using NSURLSession to search for the route on a background thread, not all suitable results/pins are displayed. Certain pins only show up upon the next didUpdateToLocation call.
So my question is: what is the best practice for syncing the results to the UI? Why do only some of the results show up in the UI, and others don't?
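A likely culprit (an assumption based on the description): annotations being added from the NSURLSession completion handler, which runs off the main thread. The usual pattern, with illustrative parseFacilities and mapView names:

import MapKit

// Hypothetical parser; replace with real decoding of the search response.
func parseFacilities(from data: Data) -> [MKPointAnnotation] {
    return []
}

// Sketch: hop to the main queue before touching the map.
func searchRoute(with request: URLRequest, mapView: MKMapView) {
    URLSession.shared.dataTask(with: request) { data, response, error in
        guard let data = data else { return }
        let annotations = parseFacilities(from: data)
        DispatchQueue.main.async {
            // UIKit/MapKit calls must happen on the main thread.
            mapView.addAnnotations(annotations)
        }
    }.resume()
}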
In my document-based macOS app I am using a storyboard. I create a custom menu and menu items in the storyboard. The menu is there, but the menu item remains inactive (greyed out).
To activate the menu, one has to use NSWindowDelegate and walk the menu tree to enable the menu items.
import Foundation
import Cocoa

class DocumentController: NSViewController, NSTextFieldDelegate {
    @IBOutlet var myOutlet: NSView!
    // ... code here ...
}

extension DocumentController: NSWindowDelegate {
    func windowDidBecomeMain(_ notification: Notification) {
        NSLog("windowDidBecomeMain - enable menu")
        if let customMenu = NSApp.mainMenu?.items[2].submenu {
            customMenu.item(at: 1)?.isEnabled = true
        }
    }

    func windowDidResignMain(_ notification: Notification) {
        NSLog("windowDidResignMain - disable menu")
    }
}
I added the outlet (myOutlet) in the storyboard, but windowDidBecomeMain() does not get called. Why?
Thanks for any help.
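One thing worth checking (a guess, not confirmed from the post): windowDidBecomeMain only fires if this controller is actually assigned as the window's delegate. For example, inside DocumentController:

// Sketch: wire this controller up as the window's delegate once the
// view is attached to a window, so the NSWindowDelegate callbacks fire.
override func viewDidAppear() {
    super.viewDidAppear()
    view.window?.delegate = self
}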
I'm trying to build a Live Activity extension. I can successfully start my Live Activity via push notification. The problem is that when I start the Live Activity from my app, I can get pushTokenUpdates, since I control everything and run the for loop that reads pushTokenUpdates. But the code that reads pushTokenUpdates isn't called when I start the Live Activity with a push notification, since the system starts it automatically (maybe it is called, but I don't know when or where).
Where am I supposed to get pushTokenUpdates when I start a Live Activity via push notification, so I can send them to my server?
The relevant code is below:
for await data in activity.pushTokenUpdates {
    let token = data.map { String(format: "%02x", $0) }.joined()
    Logger().log("Activity token: \(token)")
    // Send the token to the server.
}
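In case it helps: one pattern (a sketch, assuming a hypothetical attributes type named MyAttributes) is to observe Activity.activityUpdates, which also emits activities the system started via push, and attach the pushTokenUpdates loop there:

import ActivityKit
import os

// Hypothetical attributes type; use your real one.
struct MyAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {}
}

// Sketch: pick up push-started activities and forward their tokens.
func observeActivityTokens() {
    Task {
        for await activity in Activity<MyAttributes>.activityUpdates {
            Task {
                for await data in activity.pushTokenUpdates {
                    let token = data.map { String(format: "%02x", $0) }.joined()
                    Logger().log("Activity token: \(token)")
                    // Send the token to the server here.
                }
            }
        }
    }
}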