Creating a minimalist iPhone camera app in Swift means setting up basic camera functionality with AVFoundation and providing a simple user interface. I will guide you through building a minimal camera app that opens the camera, lets users take photos, and saves them to their photo library.
1. Project Setup
Open Xcode and create a new project:
- Open Xcode.
- Create a new project: File -> New -> Project.
- Select “App” under iOS.
- Name the project (e.g., “MinimalistCameraApp”), set Swift as the language, and choose Storyboard as the interface (the code below uses UIKit).
- Leave the “Use Core Data” and “Include Tests” options unchecked; this minimal project doesn’t need them.
2. App Permissions
To access the camera, you need to add the appropriate permissions in your Info.plist.
- Open Info.plist and add the following keys:
<key>NSCameraUsageDescription</key>
<string>We need access to your camera to take photos.</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>We need access to save photos to your library.</string>
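These Info.plist entries only declare why you need access; iOS still shows the permission prompts at runtime. If you want to handle a denied camera permission gracefully instead of showing a black preview, here is a minimal sketch of an explicit check (the checkCameraAccess helper name is my own, not part of any API; it assumes the same import AVFoundation as the main file):

func checkCameraAccess(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        // Triggers the system prompt; the callback arrives on an arbitrary queue
        AVCaptureDevice.requestAccess(for: .video) { granted in
            DispatchQueue.main.async { completion(granted) }
        }
    default:
        // .denied or .restricted: consider directing the user to Settings
        completion(false)
    }
}

You could call this at the top of viewDidLoad and only build the capture session once the completion reports true.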
3. Creating the Camera View
Below is the Swift code for a minimalist camera app using UIKit and AVFoundation:
ViewController.swift
import UIKit
import AVFoundation
import Photos

class ViewController: UIViewController {

    var captureSession: AVCaptureSession!
    var photoOutput: AVCapturePhotoOutput!
    var previewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Initialize the capture session and set up the camera
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = .photo

        guard let backCamera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
            print("Unable to access back camera!")
            return
        }

        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            photoOutput = AVCapturePhotoOutput()
            if captureSession.canAddInput(input) && captureSession.canAddOutput(photoOutput) {
                captureSession.addInput(input)
                captureSession.addOutput(photoOutput)
                setupLivePreview()
            }
        } catch {
            print("Error: unable to initialize back camera: \(error.localizedDescription)")
        }
    }

    func setupLivePreview() {
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.videoGravity = .resizeAspect
        previewLayer.connection?.videoOrientation = .portrait
        previewLayer.frame = view.bounds // use bounds, not frame, for the layer's coordinate space
        view.layer.insertSublayer(previewLayer, at: 0)

        // Start the capture session off the main thread (startRunning blocks)
        DispatchQueue.global(qos: .userInitiated).async {
            self.captureSession.startRunning()
        }

        // Add a button to take a photo
        let captureButton = UIButton(type: .system)
        captureButton.setTitle("Take Photo", for: .normal)
        captureButton.tintColor = .white
        captureButton.backgroundColor = .black
        captureButton.frame = CGRect(x: (view.frame.width - 150) / 2, y: view.frame.height - 80, width: 150, height: 50)
        captureButton.layer.cornerRadius = 10
        captureButton.addTarget(self, action: #selector(didTapTakePhoto), for: .touchUpInside)
        view.addSubview(captureButton)
    }

    @objc func didTapTakePhoto() {
        let settings = AVCapturePhotoSettings()
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func savePhotoToLibrary(image: UIImage) {
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }) { success, error in
            if success {
                print("Photo saved successfully")
            } else {
                print("Error saving photo: \(String(describing: error))")
            }
        }
    }
}

extension ViewController: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        if let error = error {
            print("Error capturing photo: \(error.localizedDescription)")
            return
        }
        guard let imageData = photo.fileDataRepresentation(),
              let image = UIImage(data: imageData) else { return }
        savePhotoToLibrary(image: image)
    }
}
4. Explanation of the Code:
- AVCaptureSession: Initializes the session that handles the camera feed.
- AVCaptureDevice: Accesses the back camera.
- AVCapturePhotoOutput: Handles capturing the photo.
- AVCaptureVideoPreviewLayer: Displays the camera’s live feed on the screen.
- didTapTakePhoto: Triggered when the user taps the “Take Photo” button; it captures the photo (see the settings sketch after this list).
- photoOutput(_:didFinishProcessingPhoto:error:): Processes the captured photo and saves it to the photo library using PHPhotoLibrary.
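As an aside, didTapTakePhoto currently passes default AVCapturePhotoSettings. If you want, say, automatic flash, here is a small sketch (the .auto choice is just an example):

@objc func didTapTakePhoto() {
    let settings = AVCapturePhotoSettings()
    // Only set a flash mode the current output actually supports
    if photoOutput.supportedFlashModes.contains(.auto) {
        settings.flashMode = .auto
    }
    photoOutput.capturePhoto(with: settings, delegate: self)
}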
5. User Interface
The app has a minimalist interface:
- A live preview of the camera using AVCaptureVideoPreviewLayer.
- A simple button labeled “Take Photo” centered at the bottom of the screen.
You can customize the button, layout, and add animations if needed, but this is the basic structure of a working camera app.
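For example, if you’d rather not hard-code the button’s frame, here is a sketch of the same button pinned with Auto Layout instead (replacing the CGRect-based lines in setupLivePreview):

captureButton.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(captureButton)
NSLayoutConstraint.activate([
    captureButton.centerXAnchor.constraint(equalTo: view.centerXAnchor),
    captureButton.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -20),
    captureButton.widthAnchor.constraint(equalToConstant: 150),
    captureButton.heightAnchor.constraint(equalToConstant: 50)
])

This keeps the button centered on every screen size and respects the home-indicator safe area.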
6. Test on a Real Device
To use the camera, you need to run the app on a real device, as the iOS simulator doesn’t have access to a camera.
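If you want the app to say so explicitly rather than just failing the back-camera guard, one option is a compile-time check (a sketch; adjust the message to taste):

#if targetEnvironment(simulator)
print("Camera capture is unavailable in the Simulator; run on a physical device.")
#else
// ...the camera setup from viewDidLoad...
#endif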
7. Prepare for the App Store
Once you’ve built, tested, and finalized your app:
- Make sure to follow Apple’s Human Interface Guidelines to ensure the design meets their expectations.
- Test thoroughly, especially the permissions (camera, photo library).
- In Xcode, go to Product -> Archive to create an archive for your app.
- Submit the app via Xcode to App Store Connect, where you can manage your App Store listing, set up pricing, and submit for review.
This app should meet the basic requirements for a minimalist camera app and can be submitted to the App Store after proper testing.
Let me know if you’d like further details or enhancements!
Also with function to only shoot and output high contrast black and white photos
To modify the existing minimalist camera app to shoot and output high-contrast black-and-white photos, you can apply an image filter after capturing the photo. We will use Core Image to process the photo and apply a black-and-white filter with enhanced contrast.
Steps to Modify the Camera App for Black-and-White Photos:
1. Import the Core Image Framework: Add import CoreImage to your ViewController.swift file, which allows us to apply filters to the captured images.
2. Add a Black-and-White Filter: After the photo is captured, we will use the CIFilter class to convert the image to grayscale and enhance the contrast.
Here’s how you can modify the app to shoot and output high-contrast black-and-white photos:
Modified ViewController.swift:
import UIKit
import AVFoundation
import Photos
import CoreImage

class ViewController: UIViewController {

    var captureSession: AVCaptureSession!
    var photoOutput: AVCapturePhotoOutput!
    var previewLayer: AVCaptureVideoPreviewLayer!
    let context = CIContext() // Core Image context for rendering

    override func viewDidLoad() {
        super.viewDidLoad()

        // Initialize the capture session and set up the camera
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = .photo

        guard let backCamera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
            print("Unable to access back camera!")
            return
        }

        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            photoOutput = AVCapturePhotoOutput()
            if captureSession.canAddInput(input) && captureSession.canAddOutput(photoOutput) {
                captureSession.addInput(input)
                captureSession.addOutput(photoOutput)
                setupLivePreview()
            }
        } catch {
            print("Error: unable to initialize back camera: \(error.localizedDescription)")
        }
    }

    func setupLivePreview() {
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.videoGravity = .resizeAspect
        previewLayer.connection?.videoOrientation = .portrait
        previewLayer.frame = view.bounds // use bounds, not frame, for the layer's coordinate space
        view.layer.insertSublayer(previewLayer, at: 0)

        // Start the capture session off the main thread (startRunning blocks)
        DispatchQueue.global(qos: .userInitiated).async {
            self.captureSession.startRunning()
        }

        // Add a button to take a photo
        let captureButton = UIButton(type: .system)
        captureButton.setTitle("Take Photo", for: .normal)
        captureButton.tintColor = .white
        captureButton.backgroundColor = .black
        captureButton.frame = CGRect(x: (view.frame.width - 150) / 2, y: view.frame.height - 80, width: 150, height: 50)
        captureButton.layer.cornerRadius = 10
        captureButton.addTarget(self, action: #selector(didTapTakePhoto), for: .touchUpInside)
        view.addSubview(captureButton)
    }

    @objc func didTapTakePhoto() {
        let settings = AVCapturePhotoSettings()
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func savePhotoToLibrary(image: UIImage) {
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }) { success, error in
            if success {
                print("Photo saved successfully")
            } else {
                print("Error saving photo: \(String(describing: error))")
            }
        }
    }

    // Apply a black-and-white filter with high contrast
    func applyHighContrastBlackAndWhiteFilter(to image: UIImage) -> UIImage? {
        guard let ciImage = CIImage(image: image) else { return nil }

        // Create the grayscale (noir) filter
        let grayscaleFilter = CIFilter(name: "CIPhotoEffectNoir")
        grayscaleFilter?.setValue(ciImage, forKey: kCIInputImageKey)

        // Apply the grayscale filter
        guard let filteredCIImage = grayscaleFilter?.outputImage else { return nil }

        // Render the output using the Core Image context
        if let cgImage = context.createCGImage(filteredCIImage, from: filteredCIImage.extent) {
            return UIImage(cgImage: cgImage)
        }
        return nil
    }
}

extension ViewController: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        if let error = error {
            print("Error capturing photo: \(error.localizedDescription)")
            return
        }
        guard let imageData = photo.fileDataRepresentation(),
              let image = UIImage(data: imageData),
              let bwImage = applyHighContrastBlackAndWhiteFilter(to: image) else { return }

        // Save the filtered image to the library
        savePhotoToLibrary(image: bwImage)
    }
}
3. Explanation of the Added Changes:
- CIContext: We use CIContext() to create a context for rendering Core Image filters.
- Grayscale Filter (CIPhotoEffectNoir): We apply the CIPhotoEffectNoir filter to convert the image to black and white with high contrast. This is a built-in Core Image filter designed for a noir-like, high-contrast black-and-white effect.
- applyHighContrastBlackAndWhiteFilter:
  - Takes the captured UIImage as input.
  - Converts the UIImage to a CIImage (Core Image’s format).
  - Applies the CIPhotoEffectNoir filter, turning the image into a high-contrast black-and-white photo.
  - Converts the filtered CIImage back to a UIImage for saving and displaying.
- Saving the Filtered Image: Once the image is processed with the black-and-white filter, it is passed to the savePhotoToLibrary function, which stores it in the user’s photo library (see the authorization sketch after this list).
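Note that performChanges triggers the add-to-library permission prompt automatically the first time it runs. If you’d rather request permission up front (available since iOS 14), here is a minimal sketch:

PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
    guard status == .authorized else {
        print("Photo library add access was not granted")
        return
    }
    // Safe to call savePhotoToLibrary(image:) from here on
}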
4. Test and Adjust:
- The app now captures photos, converts them to black-and-white with high contrast, and saves the processed image to the photo library.
- Test the app on a real device to ensure the filters are applied correctly and the images are saved.
5. Customization (Optional):
If you want to adjust the contrast even more or fine-tune the effect, you can chain multiple Core Image filters (e.g., CIExposureAdjust or CIColorControls for contrast, brightness, etc.).
Here is how you could apply an additional contrast filter:
func applyHighContrastBlackAndWhiteFilter(to image: UIImage) -> UIImage? {
    guard let ciImage = CIImage(image: image) else { return nil }

    // Apply the grayscale (noir) filter
    let grayscaleFilter = CIFilter(name: "CIPhotoEffectNoir")
    grayscaleFilter?.setValue(ciImage, forKey: kCIInputImageKey)

    // Get the filtered image
    guard let grayscaleImage = grayscaleFilter?.outputImage else { return nil }

    // Create a contrast adjustment filter
    let contrastFilter = CIFilter(name: "CIColorControls")
    contrastFilter?.setValue(grayscaleImage, forKey: kCIInputImageKey)
    contrastFilter?.setValue(1.5, forKey: kCIInputContrastKey) // adjust contrast (1.0 is the default)

    // Apply the contrast filter and render the image
    if let outputImage = contrastFilter?.outputImage,
       let cgImage = context.createCGImage(outputImage, from: outputImage.extent) {
        return UIImage(cgImage: cgImage)
    }
    return nil
}
In this enhanced function, we adjust the contrast using CIColorControls after applying the CIPhotoEffectNoir filter.
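If you also want to push exposure, CIExposureAdjust (mentioned above) chains in exactly the same way. Here is a sketch of one more link, assumed to sit between the contrast filter and the final render inside the same function (the 0.5 EV value is arbitrary):

// Brighten after the contrast adjustment (positive EV brightens, negative darkens)
let exposureFilter = CIFilter(name: "CIExposureAdjust")
exposureFilter?.setValue(outputImage, forKey: kCIInputImageKey)
exposureFilter?.setValue(0.5, forKey: kCIInputEVKey)
guard let finalImage = exposureFilter?.outputImage else { return nil }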
6. Conclusion:
This code should now output high-contrast black-and-white photos when you take a picture. It keeps the app simple, but it adds the necessary image processing step to achieve the desired effect. Once you have finished testing, follow the same process as before to prepare the app for submission to the App Store.
Let me know if you need further refinements!