Not a line-by-line code dump. A senior-developer walkthrough of the design decisions, the iOS patterns that make it feel native, and the coding attitude that keeps the codebase shippable at the end. Every step shows the live render on the right as you read.
Before we make anything pretty, we make it predictable. Here's how a senior dev opens a blank SwiftUI project and doesn't paint themselves into a corner on day one.
Muse is a dating app, so the feel has to come first: warm, personal, premium, calm. Not loud. Not gamified. Every colour choice, every motion, every piece of copy refers back to this sentence.
We pick a palette that carries the feel — a pink/purple gradient for the brand and love metaphors, off-white neutrals everywhere else. Anything we're tempted to add that doesn't serve warm, personal, premium, calm gets cut.
Write the one-sentence feel at the top of your README. When you're debating a colour or a transition two weeks from now, that sentence is the judge — not your taste.
We use the boring, canonical SwiftUI layout: Model/, View/, ViewModel/, plus Assets.xcassets/. No Composable Architecture. No Redux. Nothing we'd have to defend in code review.
// Shared/
//   MuseApp.swift — entry point
//   ContentView.swift — one line: Home()
// Model/
//   Profile.swift — the data
//   Category.swift — tab groups
// View/
//   Home.swift — parallax + sections
//   HeaderView.swift — top chrome
//   CardView.swift — one profile card
// ViewModel/
//   HomeViewModel.swift — scroll offset + selected tab
Two weeks from now when a new developer opens this project, they should know where anything lives in under ten seconds. That's the goal.
Most tutorials write name: String, age: Int and move on. A senior dev thinks about what each field does to the UI before adding it:
struct Profile: Identifiable {
var id = UUID().uuidString
var name: String
var age: Int
var distanceKm: Double
var bio: String
var interests: [String] // three, exactly
var matchPercent: Int // drives the ring
var isVerified: Bool
var isOnline: Bool
var lastActive: String // "Active now" / "2h ago"
var image: String
}

A field that doesn't pay rent on screen gets cut. Every property here drives a visible detail: the chip, the dot, the ring, the line height. No silent fields.
The view model only needs to know two things: how far the user has scrolled (drives the entire top chrome) and which tab is currently in the viewport (drives the chip selection).
class HomeViewModel: ObservableObject {
@Published var offset: CGFloat = 0
@Published var selectedtab = categoryItems.first!.tab
}

No fetch logic. No cache. No speculative state. When we need any of those, we'll add them — and regret them only if we added them before we needed them.
A ViewModel is a drawer, not a warehouse. If you're stuffing it with things the view doesn't bind to, you've built a ball of mud.
None of this renders anything interesting yet. That's the point — every hour here buys ten hours of "oh I just drop a view in and it works" later. The next six chapters all assume this scaffolding is trustworthy.
Next up we build the card. This is where the fun starts.
A card is an argument. Every element is trying to convince the user to tap it. Here's how we stack them so the argument actually lands.
Before touching SwiftUI, sketch the card as a 2×3 grid: Photo, Identity, Meta, Bio, Signals, Action. If a candidate element doesn't fit a zone, it doesn't belong on the card.
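One way to keep the zone map honest is to mirror it in the view skeleton. A sketch — the zone names come from the 2×3 grid above, and every subview here is a throwaway placeholder, not the shipped card:

```swift
import SwiftUI

// Sketch: the 2×3 grid as a view skeleton. Each zone is a placeholder;
// a candidate element that has no zone to live in has no line to live on.
struct CardSkeleton: View {
    var body: some View {
        VStack(alignment: .leading, spacing: 12) {
            photoZone              // Photo: full-bleed image
            HStack {
                identityZone       // Identity: name, age, verified
                Spacer()
                metaZone           // Meta: distance, last active
            }
            bioZone                // Bio: the short copy
            HStack {
                signalsZone        // Signals: chips + match ring
                Spacer()
                actionZone         // Action: the Like button
            }
        }
        .padding()
    }

    private var photoZone: some View { Color.gray.frame(height: 220) }
    private var identityZone: some View { Text("Name, 26") }
    private var metaZone: some View { Text("2 km") }
    private var bioZone: some View { Text("Bio…") }
    private var signalsZone: some View { Text("Chips + ring") }
    private var actionZone: some View { Text("♥") }
}
```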
Chips are a classic overreach point in dating-app design. Five categorical colours and the card looks like a slot machine. We use one pink/purple gradient at 18% opacity with a slightly darker border. Visual consistency reads as premium.
.padding(.horizontal, 8).padding(.vertical, 4)
.background(
Capsule().fill(
LinearGradient(colors: [.pink.opacity(0.18), .purple.opacity(0.18)],
startPoint: .leading, endPoint: .trailing)
)
)
.overlay(Capsule().stroke(.pink.opacity(0.25), lineWidth: 0.5))
.foregroundColor(.pink)

Three reading levels: glance (ring fill), skim (bold percentage), read (label). This is a classic iOS affordance — see Apple's Activity rings.
Circle()
.trim(from: 0, to: fraction)
.stroke(
AngularGradient(colors: [.pink, .purple, .pink],
center: .center),
style: StrokeStyle(lineWidth: 4, lineCap: .round)
)
.rotationEffect(.degrees(-90))

Angular gradient + trim on a Circle = Apple's Activity Ring recipe. You'll reach for it for progress, confidence, battery — anything where a percentage wants a shape.
Off: outline on soft pink tint. On: filled pink/purple gradient and 10% larger. Spring response 0.35, damping 0.55 — snappy but not jittery. A little overshoot makes it feel like it's reacting emotionally.
.scaleEffect(liked ? 1.1 : 1.0)
.animation(.spring(response: 0.35, dampingFraction: 0.55), value: liked)
.animation(.default) is the "I'll think about this later" animation. Pick a spring with real response/damping for anything emotional.
.secondarySystemBackground, 22pt continuous rounded rectangle, 1pt .pink.opacity(0.08) border, soft black shadow (opacity 0.04, radius 8, y 4). That shadow is what sells "floating" without the card trying.
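That recipe, sketched as a reusable container. `CardContainer` is an illustrative name, not the shipped type; only the numbers come from the text above:

```swift
import SwiftUI

// Sketch of the card container: system background, continuous 22pt
// corners, a hairline pink border, and a soft floating shadow.
struct CardContainer<Content: View>: View {
    @ViewBuilder var content: Content

    var body: some View {
        content
            .background(Color(.secondarySystemBackground))
            .clipShape(RoundedRectangle(cornerRadius: 22, style: .continuous))
            .overlay(
                RoundedRectangle(cornerRadius: 22, style: .continuous)
                    .stroke(Color.pink.opacity(0.08), lineWidth: 1)
            )
            .shadow(color: .black.opacity(0.04), radius: 8, y: 4)
    }
}
```

Keeping the radius, border, and shadow in one wrapper means every card ships identical chrome — there's no second place for the numbers to drift.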
The top of the screen has ten seconds to earn a scroll. Here's the machine that does it.
A GeometryReader around the hero reads reader.frame(in: .global).minY. Positive (pulled down): stretch. Negative (scrolling up): translate at the same rate, so the image parallaxes behind the content.
.frame(height: 360 + (offset > 0 ? offset : 0))
.offset(y: (offset > 0 ? -offset : 0))
Layer two linear gradients: a top-to-clear at 60% black (for the status bar) and a clear-to-bottom at 45% (for the title). The middle stays transparent so the subject breathes.
LinearGradient(
colors: [.black.opacity(0.60), .clear, .clear, .black.opacity(0.45)],
startPoint: .top, endPoint: .bottom
)

Font.system(size: 44, weight: .heavy) with a single soft black shadow at 30% opacity. If your title needs more than one shadow, your gradient is too weak.
"Featured Today" uses a solid gradient. Beside it, "Online · 2 km" uses .ultraThinMaterial like a frosted tag clipped onto the photo. Two distinct styles that still belong to the same family — iOS does this constantly (Control Center toggles vs pill labels).
The bar at the top of a social app is its hardest single component. Here's how we ship one that doesn't look like a web banner.
SwiftUI's pinnedViews: [.sectionHeaders] pins below the safe area and always looks like a banner strip. For chrome that extends under the status bar, put it in a ZStack overlay above the scroll view and drive it with scroll offset.
ZStack(alignment: .top) {
ScrollView { … }
.ignoresSafeArea(.container, edges: .top)
HeaderView()
Our first pass had two rows, and the empty centre of the wordmark row looked cheap. Merged: tiny wordmark left, horizontally scrolling chip strip in the centre, icons right. 48pt tall. One row.
HStack(spacing: 10) {
wordmark.opacity(easedT)
chipStrip.opacity(easedT)
iconButton("bell")
iconButton("suit.heart.fill")
}A horizontal scroll view with hard edges inside a nav bar reads as "cut off." Masking both ends with an alpha gradient turns the cut-off into a "more to see" affordance.
.mask(
LinearGradient(
colors: [.clear, .black, .black, .black, .clear],
startPoint: .leading, endPoint: .trailing
)
)At the top, only two frosted icons float over the photo. As the user scrolls, a .regularMaterial backdrop fades in (opacity tied to scroll), with a 5% pink/purple tint. The 0.5pt hairline at the bottom eases in only when fully materialised.
Apple Music "Listen Now," Maps' location sheet, Photos' nav — all the same recipe: transparent at rest, material on scroll, hairline at full. Your users already know this language — speak it.
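A sketch of that backdrop as its own view. `HeaderBackdrop` is an illustrative name; `easedT` is the 0-to-1 scroll progress computed in the motion chapter:

```swift
import SwiftUI

// Sketch: material backdrop that fades in with scroll, with a tinted
// overlay and a 0.5pt hairline that appears only at full opacity.
struct HeaderBackdrop: View {
    var easedT: CGFloat   // 0 at rest, 1 when fully materialised

    var body: some View {
        ZStack(alignment: .bottom) {
            Rectangle()
                .fill(.regularMaterial)
                .overlay(
                    LinearGradient(colors: [.pink.opacity(0.05),
                                            .purple.opacity(0.05)],
                                   startPoint: .leading, endPoint: .trailing)
                )
            Rectangle()                       // the hairline
                .fill(Color.primary.opacity(0.15))
                .frame(height: 0.5)
                .opacity(easedT >= 1 ? 1 : 0)
        }
        .opacity(easedT)
        .ignoresSafeArea(edges: .top)
    }
}
```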
Motion is not decoration. Motion is information. Here's how to use the minimum amount of it to sell the maximum amount of polish.
Number-one sign of an amateur UI: every element animates with a slightly different duration and curve. Fix is embarrassingly simple: compute one t: CGFloat from offset, smoothstep it, use easedT everywhere. In sync by construction.
var t: CGFloat {
let raw = (homeData.offset - 140) / 140
return min(max(raw, 0), 1)
}
var easedT: CGFloat { t * t * (3 - 2 * t) } // smoothstep

The indicator capsule glides between tabs because we use matchedGeometryEffect(id: "chipBG", in: chipNS). Nothing animates explicitly — SwiftUI interpolates size and position because both "from" and "to" chips share the namespace.
Same recipe as Apple's segmented control indicator and Music's "now playing." Once you see it you can't un-see it.
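The moving-capsule recipe, sketched as a standalone strip. The tab list, paddings, and spring values here are assumptions; the matchedGeometryEffect id and namespace are the ones from the text:

```swift
import SwiftUI

// Sketch: only the selected chip hosts the capsule, but because every
// position shares one geometry id in one namespace, SwiftUI interpolates
// the capsule's frame whenever the selection moves.
struct ChipStrip: View {
    let tabs: [String]
    @Binding var selected: String
    @Namespace private var chipNS

    var body: some View {
        HStack(spacing: 8) {
            ForEach(tabs, id: \.self) { tab in
                Text(tab)
                    .padding(.horizontal, 12).padding(.vertical, 6)
                    .background {
                        if tab == selected {
                            Capsule()
                                .fill(Color.pink.opacity(0.2))
                                .matchedGeometryEffect(id: "chipBG", in: chipNS)
                        }
                    }
                    .onTapGesture {
                        withAnimation(.spring(response: 0.35,
                                              dampingFraction: 0.8)) {
                            selected = tab
                        }
                    }
            }
        }
    }
}
```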
The green halo ripples outward and fades. .repeatForever(autoreverses: false) so it always grows from small to large — shrink-back makes pulsing UI look broken.
withAnimation(.easeOut(duration: 1.4).repeatForever(autoreverses: false)) {
pulse = true
}Scroll-driven animations are already smoothed by the scroll system — stacking a spring on top looks drunk. State crossfades want .easeInOut(0.25–0.3), not a spring. Springs belong on discrete state changes (the Like button).
Separate from motion and layout. This is the pass that turns a shippable prototype into an app someone's grandmother, VoiceOver user, or chronic-migraine sufferer can actually use.
A fresh SwiftUI view with ten subviews produces ten VoiceOver stops. For a card, that's grating. Combine everything into a single element with a deliberate label, and expose real actions for the Rotor.
.accessibilityElement(children: .combine)
.accessibilityLabel(accessibilitySummary) // "Mei Lin, 26, verified, 94% match, ..."
.accessibilityAddTraits(.isButton)
.accessibilityAction(named: liked ? "Unlike" : "Like") {
toggleLike()
}

Decorative children (gradients, shadows, the pulsing halo) get .accessibilityHidden(true) so the summary reads cleanly.
If the card has a primary action (Like), expose it as an accessibilityAction. The Rotor user shouldn't have to go hunting for a 36×36 button — they should just rotate to "Actions" and hear "Like."
Reading copy — bios, metadata, chip labels — uses the semantic scale (.caption2, .caption, .footnote) so it tracks the user's Dynamic Type setting. Display type — the big hero name — stays pt-locked so the visual rhythm survives, with a safety net:
Text("Mei Lin, 26")
.font(.system(size: 44, weight: .heavy))
.minimumScaleFactor(0.7) // shrink rather than truncate
.lineLimit(1)

Numerals (match percentages, timers, prices) get .monospacedDigit() so layout doesn't jitter when digits change width.
The heuristic: anything the user reads scales; anything the user sees as a graphic stays fixed with a fallback. A headline at 44pt is a graphic. A 12pt label is reading.
Reduce Motion isn't a hint — it's a medical accommodation for vestibular disorders. Halve-the-duration is the wrong fix. The right fix is to remove the effect and present the final state.
@Environment(\.accessibilityReduceMotion) private var reduceMotion
// Parallax hero: zero the stretch + translate under reduce motion
let stretch = reduceMotion ? 0 : max(0, offset)
let translate = reduceMotion ? 0 : (offset > 0 ? -offset : 0)
// Online pulse: skip the repeatForever halo entirely
if !reduceMotion {
Circle().fill(Color.green.opacity(0.35)) // halo
.frame(width: pulse ? 22 : 12, ...)
}
// Header crossfade: pass a nil animation to make it instant
.animation(reduceMotion ? nil : .easeInOut(duration: 0.3), value: easedT)

The static dot still communicates "online"; the ring without parallax still communicates "hero." Nothing is lost except motion.
iOS 17 added a declarative hook for haptics: it fires whenever the trigger value changes. Use it where a user decides something — Like, select a tab, complete a step.
// On the Like button — medium impact thud
.sensoryFeedback(.impact(weight: .medium), trigger: liked)
// On the nav chrome root — softer selection click as the tab changes
.sensoryFeedback(.selection, trigger: homeData.selectedtab)
Don't over-haptic. Every tap is not a state change. A haptic on scroll or on hover is spam. Reserve them for moments the user genuinely caused a decision to land — liking, selecting, committing — and they become an unmistakable signature of polish.
The classic SwiftUI parallax tutorial does this inside a GeometryReader:
GeometryReader { reader -> AnyView in
let offset = reader.frame(in: .global).minY
if -offset >= 0 {
DispatchQueue.main.async {
self.homeData.offset = -offset // ⚠️ side effect in render
}
}
return AnyView(heroImage(offset: offset))
}

It works, but it writes to observable state while the view is building. SwiftUI emits "Modifying state during view update" in debug. Cure:
GeometryReader { reader in
let y = reader.frame(in: .global).minY
Color.clear
.onChange(of: y) { _, newY in
if -newY >= 0 { homeData.offset = -newY }
}
.overlay(heroImage(offset: y))
}

Same effect, zero render-time mutation, and it plays nicely with iOS 17's strict runtime checks.
On iOS 18+ the cleanest form is .onScrollGeometryChange(for: CGFloat.self, of: { $0.contentOffset.y }) { oldY, newY in … } — a one-liner, no GeometryReader needed. Fall back to the overlay trick for iOS 17.
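In context it looks like this — a sketch assuming the HomeViewModel from the setup chapter. Note the sign convention: contentOffset.y grows positive as the user scrolls, which matches the stored offset, unlike the global-frame minY:

```swift
import SwiftUI

// Sketch: iOS 18 scroll tracking without a GeometryReader. The modifier
// observes scroll geometry and reports only the derived CGFloat we asked
// for, so the view body stays free of side effects.
struct HeroScrollReader: View {
    @EnvironmentObject var homeData: HomeViewModel

    var body: some View {
        ScrollView {
            Color.clear.frame(height: 1200) // placeholder for hero + sections
        }
        .onScrollGeometryChange(for: CGFloat.self) { geo in
            geo.contentOffset.y
        } action: { _, newY in
            homeData.offset = max(0, newY)
        }
    }
}
```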
Tutorials usually end when the last animation lands. Real projects ship images. Here's the pipeline we use.
Submit prompts to an image API, save the clean output to assets/generated/original/. We never ship those. They're master copies — if the model version changes, we can still reconstruct exactly what we shipped.
Every prompt is saved beside the image as <name>.prompt.json with model, dimensions, and text. Reproducibility as a habit.
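The sidecar can be emitted from Swift as easily as from a script — a sketch with assumed field names; the real schema is whatever your generation tooling writes:

```swift
import Foundation

// Sketch of a <name>.prompt.json sidecar. The field names here are an
// assumption, not a spec — the point is that the sidecar is Codable and
// round-trips, so reproducing a shipped asset is one decode away.
struct PromptSidecar: Codable {
    var model: String
    var width: Int
    var height: Int
    var text: String
}

let sidecar = PromptSidecar(
    model: "image-model-v2",   // hypothetical model id
    width: 1024, height: 1536,
    text: "warm portrait, soft window light"
)

let encoder = JSONEncoder()
encoder.outputFormatting = [.prettyPrinted, .sortedKeys]
let data = try! encoder.encode(sidecar)
// Saved beside the image: mei_lin.png + mei_lin.prompt.json
print(String(data: data, encoding: .utf8)!)
```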
A good watermark holds up on any background: white at 55% alpha over a soft black drop shadow drawn in three 1-pixel offset passes. Padding is 1.5% of image width so it scales with the asset. Font size floors at 11pt so the mark never disappears.
# scripts/watermark.py
for dx, dy in [(1,1), (0,1), (1,0)]:
draw.text((x+dx, y+dy), MARK, font=font, fill=(0,0,0,120))
draw.text((x, y), MARK, font=font, fill=(255,255,255,140))Pipeline wipes stale placeholder JPGs inside each .imageset, copies the watermarked PNG in, rewrites Contents.json. Re-runnable, idempotent, never half-installs.
If a human has to drag files into Xcode, the pipeline is incomplete. Every asset-delivery step should be a shell script that takes you from fresh clone to shippable binary.
When every element reports its state honestly through motion, typography, and material, shipping is the easy part. Build Release, screen-record the scroll flow, post it.
That's Muse. Seven chapters, thirty-one steps. You now have every pattern for the next twenty projects.
Pro gives you unlimited reads across Swift, SwiftUI, AVFoundation, API Deep Dive, Promotion, and every Muse-style project build — plus the downloadable source of every project tutorial.
Go Pro — $7.99/mo