Creating a minimalist iPhone camera app using Swift and SwiftUI involves several key components:
- Setting Up the Project: Configure the project with necessary permissions.
- Creating the Camera Interface: Display the camera preview.
- Capturing Photos: Implement functionality to take and save photos.
- Handling Permissions: Ensure the app requests and handles camera access appropriately.
Below is a step-by-step guide with the complete code to create a minimalist camera app ready for the App Store.
1. Project Setup
a. Create a New SwiftUI Project
- Open Xcode and select File > New > Project.
- Choose App under iOS and click Next.
- Name your project (e.g., MinimalCameraApp), ensure Interface is set to SwiftUI, and Language is Swift.
- Click Next and save your project in the desired location.
b. Configure Permissions
To access the camera, you must declare the usage description in your app’s Info.plist.
- In the Project Navigator, select Info.plist.
- Add a new key: Privacy - Camera Usage Description (NSCameraUsageDescription).
- Provide a description, e.g., "This app requires access to the camera to take photos."
Optionally, if you intend to save photos to the user’s photo library, also add: Privacy - Photo Library Additions Usage Description (NSPhotoLibraryAddUsageDescription), e.g., "This app saves photos to your photo library."
2. Implementing the Camera Functionality
We’ll use AVFoundation to manage the camera session and SwiftUI for the user interface.
a. Create a Camera Model
Create a new Swift file named CameraModel.swift and add the following code:
import Foundation
import AVFoundation
import SwiftUI
import Photos

class CameraModel: NSObject, ObservableObject, AVCapturePhotoCaptureDelegate {
    // Published properties to update the UI
    @Published var session = AVCaptureSession()
    @Published var alert = false
    @Published var output = AVCapturePhotoOutput()
    @Published var preview: UIImage?

    override init() {
        super.init()
        checkPermissions()
    }

    func checkPermissions() {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized:
            setup()
        case .notDetermined:
            AVCaptureDevice.requestAccess(for: .video) { granted in
                if granted {
                    self.setup()
                } else {
                    DispatchQueue.main.async {
                        self.alert = true
                    }
                }
            }
        default:
            DispatchQueue.main.async {
                self.alert = true
            }
        }
    }

    func setup() {
        // Configure and start the session off the main thread;
        // startRunning() blocks until the session is running.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.beginConfiguration()
            // Setup input
            guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
                DispatchQueue.main.async { self.alert = true }
                return
            }
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if self.session.canAddInput(input) {
                    self.session.addInput(input)
                }
            } catch {
                DispatchQueue.main.async { self.alert = true }
                return
            }
            // Setup output
            if self.session.canAddOutput(self.output) {
                self.session.addOutput(self.output)
                self.output.isHighResolutionCaptureEnabled = true
            }
            self.session.commitConfiguration()
            self.session.startRunning()
        }
    }

    func takePic() {
        let settings = AVCapturePhotoSettings()
        // Deprecated in iOS 16; newer targets can use maxPhotoDimensions instead
        settings.isHighResolutionPhotoEnabled = true
        output.capturePhoto(with: settings, delegate: self)
    }

    // AVCapturePhotoCaptureDelegate method
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        if let error = error {
            print("Error capturing photo: \(error)")
            return
        }
        guard let imageData = photo.fileDataRepresentation(),
              let uiImage = UIImage(data: imageData) else {
            return
        }
        // Save to photo library
        savePhoto(image: uiImage)
        DispatchQueue.main.async {
            self.preview = uiImage
        }
    }

    func savePhoto(image: UIImage) {
        // Request authorization to save to the Photo Library
        PHPhotoLibrary.requestAuthorization { status in
            if status == .authorized {
                // Save image
                PHPhotoLibrary.shared().performChanges({
                    PHAssetChangeRequest.creationRequestForAsset(from: image)
                }) { success, error in
                    if let error = error {
                        print("Error saving photo: \(error)")
                    }
                }
            } else {
                DispatchQueue.main.async {
                    self.alert = true
                }
            }
        }
    }
}
Explanation:
- CameraModel manages the camera session using AVCaptureSession.
- It checks for camera permissions and requests access if not determined.
- Sets up the camera input and output.
- Captures photos and saves them to the user’s photo library.
- Publishes changes to update the UI accordingly.
b. Create a Camera Preview
To display the camera feed within SwiftUI, we need a UIViewRepresentable that wraps an AVCaptureVideoPreviewLayer.
Create a new Swift file named CameraPreview.swift and add:
import SwiftUI
import AVFoundation

struct CameraPreview: UIViewRepresentable {
    @ObservedObject var camera: CameraModel

    func makeUIView(context: Context) -> UIView {
        let view = UIView(frame: UIScreen.main.bounds)
        let previewLayer = AVCaptureVideoPreviewLayer(session: camera.session)
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.connection?.videoOrientation = .portrait
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Keep the preview layer sized to the view
        if let previewLayer = uiView.layer.sublayers?.first as? AVCaptureVideoPreviewLayer {
            previewLayer.frame = uiView.bounds
        }
    }
}
Explanation:
- CameraPreview is a SwiftUI view that wraps a UIView displaying the camera’s video feed.
- It observes the CameraModel to stay in sync with the camera session.
c. Building the User Interface
Update ContentView.swift with the camera interface:
import SwiftUI

struct ContentView: View {
    // @StateObject keeps a single CameraModel alive across view updates
    @StateObject private var camera = CameraModel()

    var body: some View {
        ZStack {
            CameraPreview(camera: camera)
                .ignoresSafeArea(.all, edges: .all)
            VStack {
                Spacer()
                HStack {
                    Spacer()
                    Button(action: {
                        camera.takePic()
                    }) {
                        ZStack {
                            Circle()
                                .fill(Color.white)
                                .frame(width: 70, height: 70)
                            Circle()
                                .stroke(Color.gray, lineWidth: 2)
                                .frame(width: 80, height: 80)
                        }
                    }
                    .padding(.bottom, 30)
                    .padding(.trailing, 20)
                }
            }
        }
        .alert(isPresented: $camera.alert) {
            Alert(title: Text("Camera Access Denied"),
                  message: Text("Please enable camera access in Settings."),
                  dismissButton: .default(Text("OK")))
        }
        .sheet(isPresented: Binding<Bool>(
            get: { camera.preview != nil },
            set: { if !$0 { camera.preview = nil } }
        )) {
            if let image = camera.preview {
                Image(uiImage: image)
                    .resizable()
                    .scaledToFit()
            }
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
Explanation:
- The ZStack layers the camera preview and the capture button.
- A circular button near the bottom-right corner allows the user to take a photo.
- An alert is presented if camera access is denied.
- After capturing a photo, a sheet displays the captured image.
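The hand-rolled Boolean Binding above works, but SwiftUI also offers an item-driven sheet that presents whenever an optional Identifiable value is non-nil. A sketch of that alternative (the CapturedImage wrapper and a camera.captured property are hypothetical, not part of the code above):

```swift
import SwiftUI
import UIKit

// Hypothetical Identifiable wrapper so an optional value can drive the sheet.
struct CapturedImage: Identifiable {
    let id = UUID()
    let image: UIImage
}

// In the model you would publish:  @Published var captured: CapturedImage?
// In the view body, instead of the Boolean Binding:
// .sheet(item: $camera.captured) { captured in
//     Image(uiImage: captured.image)
//         .resizable()
//         .scaledToFit()
// }
```

With `.sheet(item:)`, dismissal automatically resets the optional to nil, so no custom setter is needed.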
3. Final Touches
a. Handling Photo Library Permissions
If you added the NSPhotoLibraryAddUsageDescription key to Info.plist, ensure your app handles cases where the user denies access gracefully.
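One way to fail gracefully is to check the current authorization status before attempting a save. A minimal sketch, assuming iOS 14+ and that a denied status should surface the same alert used for camera access:

```swift
import Photos

// Check add-only photo library access before saving (iOS 14+).
func canSaveToLibrary() -> Bool {
    let status = PHPhotoLibrary.authorizationStatus(for: .addOnly)
    // .notDetermined is fine: the system prompt appears on the first save attempt.
    return status == .authorized || status == .notDetermined
}
```

If this returns false, show an alert pointing the user to Settings instead of silently dropping the photo.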
b. Testing on a Real Device
Camera functionality cannot be fully tested on the simulator. Deploy the app to a physical iPhone for testing:
- Connect your iPhone to your Mac.
- Select your device as the run destination in Xcode.
- Click the Run button to build and install the app on your device.
- Test capturing photos and ensure they save to the photo library.
c. App Store Requirements
Before submitting to the App Store, ensure:
- App Icons and Launch Screens: Provide appropriate app icons and launch images.
- Privacy Policies: If your app uses user data, include a privacy policy.
- App Store Metadata: Prepare app descriptions, screenshots, and other required metadata.
- Compliance: Ensure your app complies with all App Store Review Guidelines.
4. Complete Code Overview
For convenience, here’s a summary of all the code files:
a. CameraModel.swift
import Foundation
import AVFoundation
import SwiftUI
import Photos

class CameraModel: NSObject, ObservableObject, AVCapturePhotoCaptureDelegate {
    @Published var session = AVCaptureSession()
    @Published var alert = false
    @Published var output = AVCapturePhotoOutput()
    @Published var preview: UIImage?

    override init() {
        super.init()
        checkPermissions()
    }

    func checkPermissions() {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized:
            setup()
        case .notDetermined:
            AVCaptureDevice.requestAccess(for: .video) { granted in
                if granted {
                    self.setup()
                } else {
                    DispatchQueue.main.async {
                        self.alert = true
                    }
                }
            }
        default:
            DispatchQueue.main.async {
                self.alert = true
            }
        }
    }

    func setup() {
        // Configure and start the session off the main thread;
        // startRunning() blocks until the session is running.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.beginConfiguration()
            guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
                DispatchQueue.main.async { self.alert = true }
                return
            }
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if self.session.canAddInput(input) {
                    self.session.addInput(input)
                }
            } catch {
                DispatchQueue.main.async { self.alert = true }
                return
            }
            if self.session.canAddOutput(self.output) {
                self.session.addOutput(self.output)
                self.output.isHighResolutionCaptureEnabled = true
            }
            self.session.commitConfiguration()
            self.session.startRunning()
        }
    }

    func takePic() {
        let settings = AVCapturePhotoSettings()
        settings.isHighResolutionPhotoEnabled = true
        output.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        if let error = error {
            print("Error capturing photo: \(error)")
            return
        }
        guard let imageData = photo.fileDataRepresentation(),
              let uiImage = UIImage(data: imageData) else {
            return
        }
        savePhoto(image: uiImage)
        DispatchQueue.main.async {
            self.preview = uiImage
        }
    }

    func savePhoto(image: UIImage) {
        PHPhotoLibrary.requestAuthorization { status in
            if status == .authorized {
                PHPhotoLibrary.shared().performChanges({
                    PHAssetChangeRequest.creationRequestForAsset(from: image)
                }) { success, error in
                    if let error = error {
                        print("Error saving photo: \(error)")
                    }
                }
            } else {
                DispatchQueue.main.async {
                    self.alert = true
                }
            }
        }
    }
}
b. CameraPreview.swift
import SwiftUI
import AVFoundation

struct CameraPreview: UIViewRepresentable {
    @ObservedObject var camera: CameraModel

    func makeUIView(context: Context) -> UIView {
        let view = UIView(frame: UIScreen.main.bounds)
        let previewLayer = AVCaptureVideoPreviewLayer(session: camera.session)
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.connection?.videoOrientation = .portrait
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        if let previewLayer = uiView.layer.sublayers?.first as? AVCaptureVideoPreviewLayer {
            previewLayer.frame = uiView.bounds
        }
    }
}
c. ContentView.swift
import SwiftUI

struct ContentView: View {
    @StateObject private var camera = CameraModel()

    var body: some View {
        ZStack {
            CameraPreview(camera: camera)
                .ignoresSafeArea(.all, edges: .all)
            VStack {
                Spacer()
                HStack {
                    Spacer()
                    Button(action: {
                        camera.takePic()
                    }) {
                        ZStack {
                            Circle()
                                .fill(Color.white)
                                .frame(width: 70, height: 70)
                            Circle()
                                .stroke(Color.gray, lineWidth: 2)
                                .frame(width: 80, height: 80)
                        }
                    }
                    .padding(.bottom, 30)
                    .padding(.trailing, 20)
                }
            }
        }
        .alert(isPresented: $camera.alert) {
            Alert(title: Text("Camera Access Denied"),
                  message: Text("Please enable camera access in Settings."),
                  dismissButton: .default(Text("OK")))
        }
        .sheet(isPresented: Binding<Bool>(
            get: { camera.preview != nil },
            set: { if !$0 { camera.preview = nil } }
        )) {
            if let image = camera.preview {
                Image(uiImage: image)
                    .resizable()
                    .scaledToFit()
            }
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
5. Building and Submitting to the App Store
a. Test Thoroughly
Ensure all functionalities work as expected on various devices and iOS versions.
b. App Store Submission
- Archive the App: In Xcode, select Product > Archive.
- Validate the Archive: Use Xcode’s validation process to check for issues.
- Upload to App Store Connect: Follow Xcode prompts to upload your app.
- Configure App Store Connect: Fill in app details, screenshots, and submit for review.
For detailed instructions, refer to Apple’s App Store Connect Guide.
By following this guide, you’ve created a minimalist camera app using Swift and SwiftUI that’s ready for submission to the App Store. Customize and enhance the app as needed to fit your specific requirements!
To modify the camera app to shoot and output only high-contrast monochrome (black and white) photos, you will need to process the captured image after taking the photo. Here’s how to implement this:
Steps:
- Capture the Photo: We already capture the photo using AVCapturePhotoOutput, so this code stays as is.
- Convert the Image to Monochrome: Use a Core Image filter (CIColorControls) for desaturation and contrast adjustments.
- Apply High Contrast: CIColorControls can also increase contrast and decrease brightness if needed.
- Update the UI to Display the Monochrome Photo.
Let’s modify the existing code to achieve this.
1. Modifying CameraModel.swift
You will process the captured image to convert it to monochrome with high contrast using Core Image filters.
Updated CameraModel.swift
import Foundation
import AVFoundation
import SwiftUI
import Photos
import CoreImage
import CoreImage.CIFilterBuiltins

class CameraModel: NSObject, ObservableObject, AVCapturePhotoCaptureDelegate {
    @Published var session = AVCaptureSession()
    @Published var alert = false
    @Published var output = AVCapturePhotoOutput()
    @Published var preview: UIImage?

    private let context = CIContext() // Core Image context for image processing

    override init() {
        super.init()
        checkPermissions()
    }

    func checkPermissions() {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized:
            setup()
        case .notDetermined:
            AVCaptureDevice.requestAccess(for: .video) { granted in
                if granted {
                    self.setup()
                } else {
                    DispatchQueue.main.async {
                        self.alert = true
                    }
                }
            }
        default:
            DispatchQueue.main.async {
                self.alert = true
            }
        }
    }

    func setup() {
        // Configure and start the session off the main thread;
        // startRunning() blocks until the session is running.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.beginConfiguration()
            // Setup input
            guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
                DispatchQueue.main.async { self.alert = true }
                return
            }
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if self.session.canAddInput(input) {
                    self.session.addInput(input)
                }
            } catch {
                DispatchQueue.main.async { self.alert = true }
                return
            }
            // Setup output
            if self.session.canAddOutput(self.output) {
                self.session.addOutput(self.output)
                self.output.isHighResolutionCaptureEnabled = true
            }
            self.session.commitConfiguration()
            self.session.startRunning()
        }
    }

    func takePic() {
        let settings = AVCapturePhotoSettings()
        settings.isHighResolutionPhotoEnabled = true
        output.capturePhoto(with: settings, delegate: self)
    }

    // AVCapturePhotoCaptureDelegate method
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        if let error = error {
            print("Error capturing photo: \(error)")
            return
        }
        guard let imageData = photo.fileDataRepresentation(),
              let uiImage = UIImage(data: imageData) else {
            return
        }
        // Apply monochrome and contrast processing
        if let processedImage = applyMonochromeFilter(to: uiImage) {
            // Save the processed image to the photo library
            savePhoto(image: processedImage)
            DispatchQueue.main.async {
                self.preview = processedImage
            }
        }
    }

    // Apply monochrome filter and adjust contrast
    func applyMonochromeFilter(to image: UIImage) -> UIImage? {
        guard let ciImage = CIImage(image: image) else { return nil }
        // Desaturate and boost contrast with a single CIColorControls filter
        let monochromeFilter = CIFilter.colorControls()
        monochromeFilter.inputImage = ciImage
        monochromeFilter.saturation = 0  // Desaturate to monochrome
        monochromeFilter.contrast = 2.0  // Increase contrast for a high-contrast effect
        // Render the filtered image
        guard let outputImage = monochromeFilter.outputImage,
              let cgImage = context.createCGImage(outputImage, from: outputImage.extent) else {
            return nil
        }
        // Convert back to UIImage
        return UIImage(cgImage: cgImage)
    }

    func savePhoto(image: UIImage) {
        PHPhotoLibrary.requestAuthorization { status in
            if status == .authorized {
                PHPhotoLibrary.shared().performChanges({
                    PHAssetChangeRequest.creationRequestForAsset(from: image)
                }) { success, error in
                    if let error = error {
                        print("Error saving photo: \(error)")
                    }
                }
            } else {
                DispatchQueue.main.async {
                    self.alert = true
                }
            }
        }
    }
}
Key Modifications:
- Monochrome Filter: We use CIColorControls with saturation = 0 to desaturate the image and contrast = 2.0 for a high-contrast effect.
- Core Image Context: A CIContext processes the image after the filter is applied and converts it back to a UIImage.
- Monochrome Image Output: The processed image is saved to the photo library and displayed in the UI.
2. The UI Remains Mostly the Same
Your ContentView.swift and CameraPreview.swift can remain the same, as they handle the UI for previewing and capturing the image. Since the processing happens in the model, the UI doesn’t need to change.
For completeness, here’s ContentView.swift again for reference:
import SwiftUI

struct ContentView: View {
    @StateObject private var camera = CameraModel()

    var body: some View {
        ZStack {
            CameraPreview(camera: camera)
                .ignoresSafeArea(.all, edges: .all)
            VStack {
                Spacer()
                HStack {
                    Spacer()
                    Button(action: {
                        camera.takePic()
                    }) {
                        ZStack {
                            Circle()
                                .fill(Color.white)
                                .frame(width: 70, height: 70)
                            Circle()
                                .stroke(Color.gray, lineWidth: 2)
                                .frame(width: 80, height: 80)
                        }
                    }
                    .padding(.bottom, 30)
                    .padding(.trailing, 20)
                }
            }
        }
        .alert(isPresented: $camera.alert) {
            Alert(title: Text("Camera Access Denied"),
                  message: Text("Please enable camera access in Settings."),
                  dismissButton: .default(Text("OK")))
        }
        .sheet(isPresented: Binding<Bool>(
            get: { camera.preview != nil },
            set: { if !$0 { camera.preview = nil } }
        )) {
            if let image = camera.preview {
                Image(uiImage: image)
                    .resizable()
                    .scaledToFit()
            }
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
3. Test the App
Make sure to test the app on a real device (since the camera doesn’t work on the simulator). The app should now capture and output photos that are high-contrast black-and-white images.
4. Summary of Changes:
- High-Contrast Monochrome Processing: Image processing now applies a monochrome filter with high contrast using Core Image.
- Filter Chain: CIColorControls controls the saturation and contrast to achieve the desired effect.
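As an aside, Core Image also ships dedicated black-and-white photo effects. A sketch of an alternative to the CIColorControls chain, using CIPhotoEffectNoir (the function name applyNoirFilter is just an illustrative stand-in for applyMonochromeFilter; tonality will differ slightly from the saturation/contrast approach):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

// Alternative monochrome path: a dedicated high-contrast B&W photo effect.
func applyNoirFilter(to image: UIImage, context: CIContext = CIContext()) -> UIImage? {
    guard let ciImage = CIImage(image: image) else { return nil }

    // CIPhotoEffectNoir produces a high-contrast black-and-white rendition.
    let noir = CIFilter.photoEffectNoir()
    noir.inputImage = ciImage

    guard let output = noir.outputImage,
          let cgImage = context.createCGImage(output, from: output.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

Swapping this in is a one-line change in photoOutput; the CIColorControls version remains preferable if you want to tune the contrast value yourself.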
By following these steps, your minimalist camera app will now take high-contrast black-and-white photos and save them to the photo library.