# LiveTextDemo

Last year, iOS 15 came with a very useful feature known as Live Text. You may have heard of the term OCR (short for Optical Character Recognition), the process of converting an image of text into a machine-readable text format. That is exactly what Live Text does.

Live Text is built into the Camera app and Photos app. If you haven't tried this feature yet, simply open the Camera app. When you point the device's camera at an image of text, a Live Text button appears in the lower-right corner. When you tap the button, iOS automatically captures the text for you, and you can then copy and paste it into other applications (e.g. Notes).

This is a very powerful and convenient feature for most users. As a developer, wouldn't it be great if you could incorporate Live Text in your own app? In iOS 16, Apple released the Live Text API so developers can power their apps with Live Text. In this demo, let's see how to use the Live Text API with SwiftUI.
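As a quick orientation before the tutorial, here is a minimal sketch of the common approach: wrapping a `UIImageView` with VisionKit's `ImageAnalyzer` and `ImageAnalysisInteraction` in a `UIViewRepresentable` so SwiftUI views can show selectable Live Text. The view name `LiveTextImageView` is just an illustrative choice; the linked tutorial may structure the code differently.

```swift
import SwiftUI
import VisionKit

// A minimal sketch (iOS 16+): an image view that enables Live Text selection.
@MainActor
struct LiveTextImageView: UIViewRepresentable {
    let image: UIImage

    // Reference types, so it is fine to keep them on the representable.
    private let analyzer = ImageAnalyzer()
    private let interaction = ImageAnalysisInteraction()

    func makeUIView(context: Context) -> UIImageView {
        let imageView = UIImageView(image: image)
        imageView.contentMode = .scaleAspectFit
        // Attach the Live Text interaction so users can select and copy recognized text.
        imageView.addInteraction(interaction)
        return imageView
    }

    func updateUIView(_ uiView: UIImageView, context: Context) {
        Task {
            // Ask the analyzer to look for text in the image.
            let configuration = ImageAnalyzer.Configuration([.text])
            if let analysis = try? await analyzer.analyze(image, configuration: configuration) {
                interaction.analysis = analysis
                interaction.preferredInteractionTypes = .textSelection
            }
        }
    }
}
```

You could then use it like any other SwiftUI view, e.g. `LiveTextImageView(image: someUIImage)` (where `someUIImage` is a placeholder for whatever image your app provides). Live Text requires a device with an A12 Bionic chip or later, so checking `ImageAnalyzer.isSupported` before showing the view is a sensible guard.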

For the full tutorial, you can refer to this link:

https://www.appcoda.com/live-text-api