UI not updating during render

I've written a small raytracer in Swift that renders a scene (based on Peter Shirley's Ray Tracing in One Weekend tutorial). The raytracer itself works fine and outputs a correct PPM file. However, I was hoping to wrap it in a UI that updates the picture as each pixel value is computed during the render, so I made a macOS app with a basic model-view architecture.

Here is my view model:

//
//  RGBViewModel.swift
//  rtweekend_gui
//
//

import SwiftUI

// RGB structure to hold color values
struct RGB {
    var r: UInt8
    var g: UInt8
    var b: UInt8
}

// ViewModel to handle the RGB array and updates
class RGBViewModel: ObservableObject {
    // Define the dimensions of your 2D array
    let width = 1200
    let height = 675
    
    // Published property to trigger UI updates
    @Published var rgbArray: [[RGB]]
    
    init() {
        // Initialize with black pixels
        rgbArray = Array(repeating: Array(repeating: RGB(r: 0, g: 0, b: 0), count: width), count: height)
    }
    
    func render_scene() {
        for j in 0..<height {
            for i in 0..<width {
                
                // Generate a random color
                let r = UInt8.random(in: 0...255)
                let g = UInt8.random(in: 0...255)
                let b = UInt8.random(in: 0...255)
                
                // Update on the main thread since this affects the UI
                DispatchQueue.main.async {
                    // Update the array
                    self.rgbArray[j][i] = RGB(r: r, g: g, b: b)
                }
            }
        }
    }
}

and here is my view:

//
//  RGBArrayView.swift
//  rtweekend_gui
//
//

import SwiftUI

struct RGBArrayView: View {
    // The 2D array of RGB values
    @StateObject private var viewModel = RGBViewModel()
    
    // Control the size of each pixel
    private let pixelSize: CGFloat = 1
    
    var body: some View {
        VStack {
            // Display the RGB array
            Canvas { context, size in
                for y in 0..<viewModel.rgbArray.count {
                    for x in 0..<viewModel.rgbArray[y].count {
                        let rgb = viewModel.rgbArray[y][x]
                        let rect = CGRect(
                            x: CGFloat(x) * pixelSize,
                            y: CGFloat(y) * pixelSize, 
                            width: pixelSize, 
                            height: pixelSize
                        )
                        
                        context.fill(
                            Path(rect),
                            with: .color(Color(
                                red: Double(rgb.r) / 255.0,
                                green: Double(rgb.g) / 255.0,
                                blue: Double(rgb.b) / 255.0
                            ))
                        )
                    }
                }
            }
            .border(Color.gray)
            
            // Button to start filling the array
            Button("Render") {
                viewModel.render_scene()
            }
            .padding()
        }
        .padding()
        .frame(width: CGFloat(viewModel.width) * pixelSize + 40, 
               height: CGFloat(viewModel.height) * pixelSize + 80)
    }
}

// Preview for SwiftUI
struct RGBArrayView_Previews: PreviewProvider {
    static var previews: some View {
        RGBArrayView()
    }
}

The render does work and the image displays. However, I thought I had set this up so that the image would update pixel by pixel as the render progresses, and that doesn't happen: the whole image shows up all at once. What am I doing wrong?
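For clarity, my assumption was that because rgbArray is @Published and every pixel write is dispatched to the main queue, each assignment would fire objectWillChange and the Canvas would redraw with the partially rendered image. Conceptually I expected each per-pixel write to behave like the sketch below (a hypothetical helper for illustration, not my actual render code):

import SwiftUI

// Hypothetical helper showing what I expected to happen per pixel:
// mutating one element of the @Published array fires objectWillChange,
// so the Canvas observing the view model redraws with the partial image.
func writePixel(to viewModel: RGBViewModel, x: Int, y: Int, color: RGB) {
    DispatchQueue.main.async {
        viewModel.rgbArray[y][x] = color   // I expected each such write to trigger a redraw
    }
}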
