I was wondering: when are you launching iOS beta 4?
Thanks for your support.
Apple Developers
Hello, dear Apple Developer Community,
I shouldn’t have done it… I was already using Sequoia Beta 3 on my Mac Studio. Everything was running stable, and I had been able to work productively with it for a while (weeks).
Recently, I decided to enable FileVault. Said and done, but unfortunately, something went wrong: since then, I can no longer log in with my normal user account (which has admin rights). Instead, only the guest user appears on the login screen (and no other account appears even when hovering over it). When I log in as this guest user, it behaves differently than usual. Only Safari can be started, and the only other things I can do are shut down or restart the system. Something has gone seriously wrong with the beta, I would say. Of course, I know I use a beta at my own risk, but I would still very much like to get back to working properly on my Mac without having to reinstall everything.
Who can help? I’ve always managed to solve issues myself so far, but in this case, I’m not sure where to start. I can still open a terminal from the recovery utilities and see the SSD utilization, etc. I can also navigate the file system to a basic extent, but I don’t see any entries under Applications or Users with ‘ls -la’, probably because the file systems are mounted with ‘nobrowse’. A current mount entry of the actual boot disk looks like this:
/dev/disk3s1 on /Volumes/Macintosh HD (apfs, sealed, local, read-only, journaled, nobrowse)
My questions are:
1. Is it sufficient to mount the drives with different options to access them properly again?
2. How do I solve the “only Safari” problem of the guest user?
3. How can I re-enable my existing real user (especially since it is an admin)?
4. Can I disable FileVault from the recovery terminal? If so, how?
5. Would a Sequoia boot stick help me? For Apple Silicon Macs, it is supposedly not that simple, but doable.
It’s also important to note that I have no access to any GUI, as I can only log in as a guest user with Safari. However, I still have other Macs to fall back on (including a MacBook Air M1).
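To make my first question more concrete, this is roughly what I intend to try from the recovery Terminal (untested in this broken state; "Macintosh HD - Data" is the default Data volume name, mine may differ):

```shell
# Identify the APFS volumes first
diskutil list
# Unlock the FileVault-encrypted Data volume (prompts for a password)
diskutil apfs unlockVolume "Macintosh HD - Data"
# Remount it read-write so the contents become accessible
mount -uw "/Volumes/Macintosh HD - Data"
# Check whether the user home folders are still intact
ls -la "/Volumes/Macintosh HD - Data/Users"
```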
If anyone knows something about my problem, I would appreciate constructive responses.
Has anyone else had similar experiences with the Sequoia Beta (now Public Beta)?
Thank you for your help,
Marcus
Topic:
Community
SubTopic:
Apple Developers
I'm developing a macOS application using Swift and a camera extension. I'm utilizing the Vision framework's VNGeneratePersonSegmentationRequest to apply a background blur effect. However, I'm experiencing significant lag in the video feed. I've tried optimizing the request, but the issue persists. Could anyone provide insights or suggestions on how to resolve this lagging issue?
Details:
Platform: macOS
Language: Swift
Framework: Vision
The code snippet I am using is below:
import Cocoa
import AVFoundation
import Vision
import CoreImage.CIFilterBuiltins

class ViewController: NSViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    var frameCounter = 0
    let frameSkipRate = 2
    private let visionQueue = DispatchQueue(label: "com.example.visionQueue")
    // Reuse a single CIContext; creating one per frame is expensive
    private let context = CIContext()

    // Streaming state (the full definitions live elsewhere in the project)
    var needToStream = false
    var enqueued = false
    var readyToEnqueue = false
    var sinkQueue: CMSimpleQueue?

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        frameCounter += 1
        // Skip frames to reduce load
        if frameCounter % frameSkipRate != 0 {
            return
        }
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        performPersonSegmentation(on: ciImage) { [self] mask in
            guard let mask = mask else { return }
            let blurredBackground = self.applyBlur(to: ciImage)
            let resultImage = self.composeImage(with: blurredBackground, mask: mask, original: ciImage)
            let nsImage = ciImageToNSImage(ciImage: resultImage)
            DispatchQueue.main.async { [self] in
                // Update the NSImageView or other UI elements with the composite image
                if needToStream {
                    if (enqueued == false || readyToEnqueue == true), let queue = self.sinkQueue {
                        enqueued = true
                        readyToEnqueue = false
                        if let cgImage = nsImage.cgImage(forProposedRect: nil, context: nil, hints: nil) {
                            enqueue(queue, cgImage) // enqueue(_:_:) is defined in the camera extension
                        }
                    }
                }
            }
        }
    }

    private func performPersonSegmentation(on image: CIImage, completion: @escaping (CIImage?) -> Void) {
        let request = VNGeneratePersonSegmentationRequest()
        request.qualityLevel = .fast // Adjust quality level as needed
        request.outputPixelFormat = kCVPixelFormatType_OneComponent8
        let handler = VNImageRequestHandler(ciImage: image, options: [:])
        visionQueue.async {
            do {
                try handler.perform([request])
                guard let result = request.results?.first as? VNPixelBufferObservation else {
                    completion(nil)
                    return
                }
                let maskImage = CIImage(cvPixelBuffer: result.pixelBuffer)
                completion(maskImage)
            } catch {
                print("Error performing segmentation: \(error)")
                completion(nil)
            }
        }
    }

    private func composeImage(with blurredBackground: CIImage, mask: CIImage, original: CIImage) -> CIImage {
        // Invert the mask so the person stays sharp and the background is blurred
        let invertedMask = mask.applyingFilter("CIColorInvert")
        // Resize the mask to match the original image
        let resizedMask = invertedMask.transformed(by: CGAffineTransform(
            scaleX: original.extent.width / invertedMask.extent.width,
            y: original.extent.height / invertedMask.extent.height))
        // Blend the images using the mask
        let blendFilter = CIFilter(name: "CIBlendWithMask")!
        blendFilter.setValue(blurredBackground, forKey: kCIInputImageKey)
        blendFilter.setValue(original, forKey: kCIInputBackgroundImageKey)
        blendFilter.setValue(resizedMask, forKey: kCIInputMaskImageKey)
        return blendFilter.outputImage ?? original
    }

    private func ciImageToNSImage(ciImage: CIImage) -> NSImage {
        let cgImage = context.createCGImage(ciImage, from: ciImage.extent)!
        return NSImage(cgImage: cgImage, size: ciImage.extent.size)
    }

    private func applyBlur(to image: CIImage) -> CIImage {
        let blurFilter = CIFilter.gaussianBlur()
        blurFilter.inputImage = image
        blurFilter.radius = 7.0 // Adjust the blur radius as needed
        return blurFilter.outputImage ?? image
    }
}
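One direction I have been considering to reduce the lag is creating the segmentation request once instead of per frame, and downscaling frames before handing them to Vision. A rough, untested sketch (the 0.5 scale factor is an arbitrary value to experiment with):

```swift
import Vision
import CoreImage

// Sketch: reuse one request and run segmentation on a downscaled frame.
// The mask gets stretched back up when compositing, so a smaller input
// mostly costs mask-edge precision, not output resolution.
final class SegmentationWorker {
    private let request: VNGeneratePersonSegmentationRequest = {
        let r = VNGeneratePersonSegmentationRequest()
        r.qualityLevel = .fast
        r.outputPixelFormat = kCVPixelFormatType_OneComponent8
        return r
    }()
    private let queue = DispatchQueue(label: "segmentation.worker")

    func mask(for image: CIImage, completion: @escaping (CIImage?) -> Void) {
        // Downscale before segmentation to cut per-frame cost
        let small = image.transformed(by: CGAffineTransform(scaleX: 0.5, y: 0.5))
        let handler = VNImageRequestHandler(ciImage: small, options: [:])
        queue.async {
            do {
                try handler.perform([self.request])
                guard let result = self.request.results?.first else {
                    completion(nil)
                    return
                }
                completion(CIImage(cvPixelBuffer: result.pixelBuffer))
            } catch {
                completion(nil)
            }
        }
    }
}
```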
Hi everyone,
I am a beginner in Swift and I am currently using DeviceActivityMonitor to develop an app for parents to control their children's app usage time. During the development process, I found that the DeviceActivityMonitor has a 6MB memory limit in the background, and my app exceeds this limit after running for a long time. I would like to know how to reduce the memory usage of DeviceActivityMonitor in the background, or where I can check the memory usage of DeviceActivityMonitor and see which variables are consuming memory.
Additionally, I want to know if a new instance of the DeviceActivityMonitor class is created every time it is called?
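To illustrate the kind of lean monitor extension I am aiming for, here is a minimal sketch; the app-group identifier is a placeholder:

```swift
import DeviceActivity
import Foundation

// Sketch: keep no long-lived state in the extension itself and write the
// little that is needed into an app group, since the extension runs under
// a tight background memory cap. "group.com.example.parental" is a placeholder.
final class Monitor: DeviceActivityMonitor {
    override func intervalDidStart(for activity: DeviceActivityName) {
        super.intervalDidStart(for: activity)
        let defaults = UserDefaults(suiteName: "group.com.example.parental")
        defaults?.set(Date(), forKey: "intervalStart-\(activity.rawValue)")
    }

    override func intervalDidEnd(for activity: DeviceActivityName) {
        super.intervalDidEnd(for: activity)
        // Avoid caching large objects or collections here
    }
}
```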
Thank you for your help!
My screen keeps going out ever since I updated to iOS 18. I can use Bluetooth without it happening, but not CarPlay.
Hello, Apple rejected my app because they said that after the splash screen the display stays white for too long and the content never appears. Is there a way to reduce this white-screen time?
There is definitely a glitch in the latest iOS 18 beta 4 update with Google Maps in CarPlay mode.
Google Maps on the car screen is barely responsive. It does not recognize your home or work address.
There is a workaround to get to your destination, though: use Siri to get directions, speak the full address, and tell it to use Google Maps.
For example: "Hey Siri, get me directions to XX LOCATION using Google Maps."
You can also use Apple Maps or Waze, but it's hard for people to use a navigation app other than the one they are used to.
I find Google Maps really easy to use and handy, and with the latest update of Google Maps you can report live incidents, accidents, slowdowns, and traffic from your dashboard.
Too many irrelevant posts come up when I attempt to search for something. There should be an easy button to mark a post as irrelevant, so it doesn't come up in searches. I'm looking for something in Xcode 15, and frequently the top posts in the search are 10 years old with no relevance whatsoever to my problem.
Hi All,
Did some searching and found some wildly differing answers: how long did it take for your developer account to be approved?
How do I correctly show a PDF document?
iPad and Xcode 15.4
Within my GameViewController, I have:
func presentScene(_ theScene: SKScene) {
    theScene.scaleMode = .resizeFill
    if let skView = self.view as? SKView {
        skView.ignoresSiblingOrder = true
        skView.showsFPS = true
        skView.showsNodeCount = true
        #if os(iOS)
        let theTransition = SKTransition.doorway(withDuration: 2.0)
        skView.presentScene(theScene, transition: theTransition)
        #elseif os(tvOS)
        skView.presentScene(theScene)
        #endif
    }
} // presentScene
I believe presentScene(theScene) goes to the sceneDidLoad() func of theScene which adds various SKSpriteNodes to the scene via a call to addChild(theNode).
So far so good ...
Until I have a SKScene wherein I wish to display a PDF.
I use this snippet to display the PDF, with this call within the SKScene's sceneDidLoad():
displayPDF("ABOUT_OLD_WEST_LOCOMOTIVE")
func displayPDF(_ itsName: String) {
    let pdfView = PDFView()
    guard let path = Bundle.main.path(forResource: itsName, ofType: "pdf")
    else { return }
    print("path")
    guard let pdfDocument = PDFDocument(url: URL(fileURLWithPath: path))
    else { return }
    print("pdfDocument")
    pdfView.displayMode = .singlePageContinuous
    pdfView.autoScales = true
    pdfView.displayDirection = .vertical
    pdfView.document = pdfDocument
} // displayPDF
Both print statements show, yet the SKScene does not display the PDF after pdfView.document = pdfDocument.
Anyone have a clue what errors I have committed?
Appreciate it.
Hi,
I am a newbie to Apple development. I do have good experience in PC development. What resources would I use to begin learning? I am planning to develop an app for personal interests.
I know there are online courses on Coursera and such. But am hoping to do it at low cost and am not interested in certification for job search, etc.
Thanks much.
I know there is plenty of time left, but if you could make iOS 18 beta go to a black screen with the gray wheel less often, I'd be appreciative. It happened three times while writing this. Of course, I entered a bug report for it. Please and thank you. I am running developer build 4.
It appears that even on iOS 18 beta 4, the deployment of a share extension target to a physical device is broken. Has anyone seen the message, "The specified capability is not supported by this device"? Just a basic iOS app, no logic defined at this point. It builds fine. The main target gets deployed fine, only when attempting to deploy the share extension do I see this message. Works fine on a simulator, just not on physical device. iOS 18 beta 4, Xcode 16 beta 4 and MacOS 15 beta.
Hello everyone, I cannot accept the Account Holder transfer agreement. The steps are shown below.
Steps:
I went to the Apple Developer website, clicked the "Account" button, and saw the message:
"""
xxx has requested that we transfer the Account Holder role to you.
You will become the new Account Holder for your organization and will assume the responsibilities of accepting all legal agreements, managing App Store submissions, and renewing your membership. Before you can accept, you’ll need to complete identity verification in the Apple Developer app.
"""
I then went to the Apple Developer app, clicked the Account tab, and saw no agreement pop up. I only saw an un-clickable "Enroll now" under "Apple Developer Program", even though I already filled in the personal information for identity verification several weeks ago. I resubmitted the personal information by contacting them; however, there have been no updates from their side.
Does anyone have a solution? I need to deploy the app as soon as possible for an upcoming event, so this is urgent for the company. Thanks so much for your help.
I would like to know the diagonal dimensions of the rear glass panel where the cameras are placed on the iPhone 15 Pro Max.
Just bought a refurbished iPad for our son and created an Apple ID for the device. When we enter the Apple ID and password, we get a Verification Failed message that reads "verification codes can't be sent to this phone number at this time." This is a Wi-Fi-only device, and we have not connected it to a phone number. Any help?
Hello everyone,
I am struggling to find a solution for the following problem, and I would be glad and thankful if anyone can help me.
My Use Case:
I am using RoomPlan to scan a room. While scanning, there is a function to take pictures. The position from where the pictures are taken will be saved (in my app, they are called "points of interest" = POI).
This works fine for a single room, but when adding a new room and combining the two of them using:
structureBuilder.capturedStructure(from: capturedRooms)
the first room will be transformed and thus moved around to fit in the world space.
The points are not transformed with the rest of the room since they are not in the rooms structure specifically, which is fine, but how can I transform the POIs too, so that they are in the correct positions where they were taken?
I used:
func captureSession(_ session: RoomCaptureSession, didEndWith data: CapturedRoomData, error: Error?)
to get the transform matrix from "arFrameReferenceOriginTransform" and apply this to the POIs, but it still seems that this is not enough.
I would be happy for any tips and help!
Thanks in advance!
My Update function:
func updatePOIPositions(with originTransform: simd_float4x4) {
    for i in 0..<poisOldRooms.count {
        let poi = poisOldRooms[i]
        let originalPosition = SIMD4<Float>(
            poi.data.cameraOriginX,
            poi.data.cameraOriginY,
            poi.data.cameraOriginZ,
            1.0
        )
        let updatedPosition = originTransform * originalPosition
        poisOldRooms[i].data.cameraX = updatedPosition.x
        poisOldRooms[i].data.cameraY = updatedPosition.y
        poisOldRooms[i].data.cameraZ = updatedPosition.z
    }
}
I am using the Instagram Basic Display API in my app. Do I have to submit the app to Facebook first before submitting it to Apple?
Or can I submit the app to Apple first and provide a test account for the Apple reviewer?
Do you think Apple could make the rear cameras even bigger, or will they remain the same size as on the 15 Pro Max?
In situations where the app receives a VoIP push, and the user starts to answer by sliding, the call is initiated and the timer starts. However, due to network issues, the app's call may not be fully ready, resulting in a delay of 5-10 seconds before the actual call begins. Is there a way to display a "loading" or "connecting" indicator on the CallKit interface during this wait time?
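As far as I know there is no dedicated "connecting" state for an incoming call, so one workaround I am considering is temporarily changing the displayed caller name via CXCallUpdate while the call sets up. A sketch; provider, callUUID, and the names are placeholders from my setup:

```swift
import CallKit

// Sketch: show "Connecting…" in place of the caller name while media sets up,
// then restore the real name once the call is actually ready.
func showConnecting(on provider: CXProvider, callUUID: UUID) {
    let update = CXCallUpdate()
    update.localizedCallerName = "Connecting…"
    provider.reportCall(with: callUUID, updated: update)
}

func showConnected(on provider: CXProvider, callUUID: UUID, callerName: String) {
    let update = CXCallUpdate()
    update.localizedCallerName = callerName
    provider.reportCall(with: callUUID, updated: update)
}
```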