In order to create a UITextView like the one in the Messages app, whose height grows to fit its contents (number of lines), I subclassed UITextView and customized intrinsicContentSize like so:
override var intrinsicContentSize: CGSize {
    var size = super.intrinsicContentSize
    if size.height == UIView.noIntrinsicMetric {
        layoutManager.glyphRange(for: textContainer)
        size.height = layoutManager.usedRect(for: textContainer).height + textContainerInset.top + textContainerInset.bottom
    }
    return size
}
As noted at WWDC, accessing layoutManager forces the view back to TextKit 1; we should instead use textLayoutManager. How can this code be migrated to support TextKit 2?
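One possible migration, assuming iOS 16 or later where UITextView exposes textLayoutManager, is sketched below; it uses usageBoundsForTextContainer in place of usedRect(for:) and keeps the TextKit 1 path as a fallback. Treat it as an untested sketch rather than a confirmed answer:

override var intrinsicContentSize: CGSize {
    var size = super.intrinsicContentSize
    if size.height == UIView.noIntrinsicMetric {
        if let textLayoutManager {
            // TextKit 2: lay out the whole document, then read the used bounds.
            textLayoutManager.ensureLayout(for: textLayoutManager.documentRange)
            size.height = textLayoutManager.usageBoundsForTextContainer.height
                + textContainerInset.top + textContainerInset.bottom
        } else {
            // TextKit 1 fallback (accessing layoutManager opts the view out of TextKit 2).
            layoutManager.glyphRange(for: textContainer)
            size.height = layoutManager.usedRect(for: textContainer).height
                + textContainerInset.top + textContainerInset.bottom
        }
    }
    return size
}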
Posts under iOS tag
I have an app that displays artwork via MPMediaItem.artwork, requesting an image with a specific size. How do I get a media item's MPMediaItemAnimatedArtwork, and how do I get its preview image and video to display to the user?
Hi, I'm looking for the best way to use MLX models, particularly those I've fine-tuned, within a React Native application on iOS devices. Is there a recommended integration path or specific API for bridging MLX's capabilities to React Native for deployment on iPhones and iPads?
Hi Apple developer community. I have a question: a lot of users don't like the new Control Center and Notification Center. Are you going to blur the background, or are you going to keep it the same?
How reliable are the models when used for comparison with something like a cholesterol test result, to inform, for example, whether it is worth going to see a doctor?
I would like to use a Tool to attach simple blood-test data to the session so that the model can analyze it and make a simple suggestion about whether it is necessary to see a doctor.
P.S.: This is with the local (on-device) model.
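Setting the Tool plumbing aside, here is a minimal sketch of the kind of on-device check I mean, assuming the FoundationModels framework; the instructions, prompt, and test values are placeholders, and this says nothing about the model's medical reliability:

import FoundationModels

// Hypothetical session (run from an async context): blood-test values are inlined
// in the prompt here, but a custom Tool could supply them from HealthKit or user input.
let session = LanguageModelSession(
    instructions: "You are not a doctor. Based on the provided test values, suggest whether the user should consult one."
)
let response = try await session.respond(
    to: "Total cholesterol: 215 mg/dL, HDL: 40 mg/dL. Should I see a doctor?"
)
print(response.content)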
After doing a software update, I can only view iMessages older than 12 hours (from a single contact) if the other person replies to an older message. Every time I leave the Messages app and go back in, the issue occurs again.
Navigation title no longer showing for the first tab (“Directives”) in my app Starship SE Corps on iOS/iPadOS 26, both when running in the Xcode 26 simulator and on an iPad device running the iPadOS 26 beta. Steps to reproduce:
1. Launch the app.
2. Notice the navigation title “Directives” is missing from the top tab in the Sidebar and Floating Tab View (iPad) and TabView (iOS).
3. Navigate to other tabs; their navigation titles appear as expected.
Worked fine (as expected) in iOS/iPadOS 18.5, but broken in iOS/iPadOS 26.
Reference Feedback: FB17987650
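For reference, a stripped-down sketch of the kind of setup involved; the tab names, icons, and contents are placeholders, not the actual Starship SE Corps code:

import SwiftUI

// Hypothetical reduction of the reported structure: a sidebar-adaptable TabView
// whose first tab wraps its content in a NavigationStack with a navigation title.
struct ContentView: View {
    var body: some View {
        TabView {
            Tab("Directives", systemImage: "list.bullet") {
                NavigationStack {
                    List(0..<10) { Text("Directive \($0)") }
                        .navigationTitle("Directives") // missing on iOS/iPadOS 26 beta
                }
            }
            Tab("Settings", systemImage: "gear") {
                NavigationStack {
                    Text("Settings")
                        .navigationTitle("Settings") // shows as expected
                }
            }
        }
        .tabViewStyle(.sidebarAdaptable) // sidebar / floating tab view on iPad
    }
}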
I was recently made aware of a relatively new method on PTChannelManagerDelegate that allows the backend infrastructure to update the PTT service on a device.
incomingServiceUpdatePush(channelManager:channelUUID:pushPayload:isHighPriority:remainingHighPriorityBudget:completion:)
I remember the person stated that there was a specific format that the payload must have, but I am unable to find any documentation about it.
Can somebody point me to documentation on how to send this type of push notification for incomingServiceUpdatePush or give me an example payload? I believe it uses the same PTT push token, but if the headers are different, then please specify those differences too. Thanks!
I pushed a configuration to my iPhone through MDM to run the content filter. However, when I modify the configuration by adding some vendor configuration, I lose the connection to the debugger and can no longer see logs or the updated configuration in Xcode; I have to build the app again. Could this be an issue with Xcode, or is it related to MDM or the configuration itself?
I use .scaleEffect(x: 1, y: -1, anchor: .center) to reverse the messages list, so that the latest message is always at the bottom.
This works correctly on iOS 18, but on iOS 26 it blurs the whole view.
Complete code:
ScrollView {
    Rectangle()
        .fill(.clear)
        .frame(height: 10)
    // if messages.isEmpty {
    //     MessagesEmpty()
    //         .padding(.horizontal, 10)
    //         .scaleEffect(x: 1, y: -1, anchor: .center)
    // }
    MessageInput(chat: chat)
        .padding(.horizontal, 10)
        .scaleEffect(x: 1, y: -1, anchor: .center)
        .id("#messag-input-identifier")
    LazyVStack(spacing: 10) {
        ForEach(messages) { (message: Message) in
            MessageItem(message: message,
                        activation: $activeMessageId,
                        audioAdapter: AudioAdapter.shared)
                .id(message.id)
        }
        .padding(.horizontal, 10)
        .scaleEffect(x: 1, y: -1, anchor: .center)
    }
    Rectangle()
        .fill(.clear)
        .frame(height: 20)
}
.scaleEffect(x: 1, y: -1, anchor: .center)
As shown in the screenshots WechatIMG49.jpg (iOS 26 beta, incorrect) and WechatIMG50.jpg (iOS 18, correct), my messages list displays normally on iOS 18, but on the iOS 26 beta the entire view is blurred.
iOS 26 Beta (incorrect):
iOS 18 (correct):
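As an aside, on iOS 17 and later a bottom-anchored list can also be built without the flip trick, which might sidestep the blur; this is only a sketch of that alternative, not a confirmed fix:

// Possible alternative: anchor the ScrollView to the bottom instead of flipping it.
ScrollView {
    LazyVStack(spacing: 10) {
        ForEach(messages) { (message: Message) in
            MessageItem(message: message,
                        activation: $activeMessageId,
                        audioAdapter: AudioAdapter.shared)
                .id(message.id)
        }
        .padding(.horizontal, 10)
    }
}
.defaultScrollAnchor(.bottom) // iOS 17+: content starts scrolled to the bottom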
Hi,
On iOS, I'd like to mark views that are inside a LazyVStack as headers for VoiceOver (make them appear in the headings rotor).
In a VStack, you just have to add .accessibilityAddTraits(.isHeader) to your header view. However, if your view is in a LazyVStack, that won't work when the view is not visible. As its name implies, LazyVStack is lazy, so that makes sense.
There is very little information online about system rotors, but it seems you are supposed to use .accessibilityRotor() with the headings system rotor (.accessibilityRotor(.headings)) outside of the LazyVStack. Something like the following.
.accessibilityRotor(.headings) {
    ForEach(entries) { entry in
        // entry.id must be the same as the id of the SwiftUI view it is about
        AccessibilityRotorEntry(entry.name, id: entry.id)
    }
}
It kind of works, but only kind of. When using .accessibilityAddTraits(.isHeader) in a VStack, the view is in the headings rotor as soon as you change screens. However, when using .accessibilityRotor(.headings), the headers (headings?) are not in the headings rotor at the time the screen appears. You have to move the accessibility focus inside the screen before your headers show up.
I'm a beginner with VoiceOver, so I don't know how a blind user accustomed to VoiceOver would perceive this, but it feels to me that having to move the focus before the headers appear in the headings rotor means some users would miss them.
So my question is: is there a way for headers inside a LazyVStack (which are not necessarily visible at first) to appear in the headings rotor as soon as the screen appears (be it using .accessibilityRotor(.headings) or anything else)?
The "SwiftUI Accessibility: Beyond the basics" talk from WWDC 2021 mentions custom rotors, not system rotors, but that should be close enough. It mentions that for accessibilityRotor to work properly it has to be applied on an accessibility container, so just in case I tried to move my .accessibilityRotor(.headings) to multiple places, with and without the accessibilityElement(children: .contain) modifier, but that did not seem to change the behavior (and I could not understand why accessibilityRotor could not automatically make the view it is applied on an accessibility container if needed).
Also, a related question: when using .accessibilityRotor(.headings) on a screen, is it fine to mix uses of .accessibilityRotor(.headings) and .accessibilityAddTraits(.isHeader)? In a screen with multiple types of content (something like ScrollView { VStack { MyHeader(); LazyVStack { /* some content */ }; LazyVStack { /* something else */ } } }), having to declare all headers in one place would make code reusability harder.
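To make the mixing question concrete, the structure I have in mind is roughly the following; MyHeader, EntryRow, and isSectionHeader are placeholders rather than real code:

ScrollView {
    VStack {
        MyHeader()
            .accessibilityAddTraits(.isHeader) // picked up while this view exists
        LazyVStack {
            ForEach(entries) { entry in
                EntryRow(entry: entry)
            }
        }
    }
}
.accessibilityRotor(.headings) {
    // Entries for headings the LazyVStack may not have created yet.
    ForEach(entries.filter(\.isSectionHeader)) { entry in
        AccessibilityRotorEntry(entry.name, id: entry.id)
    }
}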
Thanks
I'm pretty excited for the all new iOS 26 redesign, but what I'm most excited for is the call screening feature to avoid unwanted calls.
I downloaded the developer's beta on my secondary device to report bugs, but call screening doesn't work on it even though it's turned on. I already sent a feedback report, but I'm not sure if it's just a beta bug or that my iPhone won't support it. I haven't found much info about it.
It's a 2nd gen SE.
Thanks.
Hi, I've recently installed the iOS 26 developer beta, and I'm experiencing heating issues on my iPhone. Is there a fix for this? Also, is it a bug or a feature that I can't remove certain sections in the Photos app, only rearrange them? Please let me know. Thanks.
It's very annoying, but on my iPhone 12 Pro the accessibility app keeps opening by itself with the microphone on and a blank screen, and every time I close it, it just reopens. I don't know why it keeps doing this, but it drives me crazy. Does anyone know what else to do? I'm on the iOS 26 beta, but it has been doing this even on the previous update.
Hi community, my name is Adam Robles, and I have a question. I recently downloaded the beta on my 13-inch iPad Pro and noticed that the iPad wallpapers are missing: I only get one wallpaper. What happened to the wallpapers that were released with the iPad last year?
I've been testing iOS 26 since it came out, and now all of a sudden it only boots into recovery and won't do anything else. Recovery says no known issues were found.
Hi everyone,
I’m currently testing iOS 26 on my iPhone as part of the developer program. According to Apple’s documentation and demo materials, a new screenshot animation was introduced in this version. However, when I take a screenshot on my device, the animation remains the same as in previous iOS versions.
I’ve double-checked that I’m running the correct build of iOS 26, and I haven’t found any settings that might enable or disable this feature.
Is anyone else experiencing the same issue? Could this new animation be device-specific, region-limited, or require additional configuration?
Any insight would be appreciated!
Thanks in advance,
Alonso Rivera
I have an app that uses NSPersistentCloudKitContainer stored in a shared location via App Groups so my widget can fetch data to display. It works. But if you reset your iPhone and restore it from a backup, an error occurs:
The file "Name.sqlite" couldn't be opened. I suspect this happens because the widget is created before the app's data is restored. Restarting the iPhone is the only way to fix it though, opening the app and reloading timelines does not. Anything I can do to fix that to not require turning it off and on again?
I have non-consumable and consumable in-app purchases in my app. The tutorial I was following stated Transaction.currentEntitlements includes unfinished consumables, which is incorrect according to the documentation. Is the correct way to handle unfinished consumables (and non-consumables) to implement Transaction.updates and call finish() if it’s verified? The documentation says that listener will receive unfinished transactions once upon app launch, so with that, do I understand correctly you do not need to implement Transaction.unfinished unless you want to look for unfinished transactions manually later on? Otherwise what is the correct and most recommended way to handle unfinished consumables? Is there a way to test that scenario in Xcode?
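For what it's worth, the listener pattern I have in mind looks roughly like this; it is a sketch of the StoreKit 2 flow described above, not a complete or confirmed answer:

import StoreKit

// Long-lived listener started at app launch. Transaction.updates also delivers
// any unfinished transactions once when the listener starts.
func listenForTransactions() -> Task<Void, Never> {
    Task.detached {
        for await result in Transaction.updates {
            switch result {
            case .verified(let transaction):
                // Grant the purchased content here, then finish the transaction.
                await transaction.finish()
            case .unverified(_, let error):
                // Don't grant content for transactions that fail verification.
                print("Unverified transaction: \(error)")
            }
        }
    }
}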
The documentation for PHAssetChangeRequest.revertAssetContentToOriginal says it will fail if the original asset content is not on the current device, so you should use PHAssetResourceManager to download it first. However, this no longer seems to be the case in the latest iOS versions: no error occurs when I take a photo on my iPhone, edit it, open Photos on my iPad and let it sync, then open my app on the iPad and call revertAssetContentToOriginal for that asset. Does the system now take care of downloading the original when needed?
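For reference, the call in question is made roughly like this; the asset fetch and error handling are simplified:

import Photos

// Ask the photo library to restore the asset's original (unedited) content.
func revertToOriginal(_ asset: PHAsset) {
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest(for: asset).revertAssetContentToOriginal()
    }) { success, error in
        // On older releases this reported an error when the original was not local.
        print("Reverted: \(success), error: \(String(describing: error))")
    }
}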