This blog post is the last in a series discussing an IP security camera system I built for my home. This post covers the iOS and macOS apps used to view the recorded video and motion events. The apps are written in Objective-C and Objective-C++. Here’s what is in the apps (so far!):
Communication with the server over the Internet uses gRPC. gRPC takes files describing the data and methods the service exposes and generates code for both the server and the client. When a method is called on the client’s stub, the corresponding method on the server’s implementation object is called with the data passed to the stub. This greatly simplifies the work of writing an Internet service. gRPC also handles SSL encryption and several forms of authentication to protect the service on the server.
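As a sketch of what one of those definition files looks like (the service and message names here are hypothetical illustrations, not the actual API of this project), a .proto file for fetching the camera list might be:

```protobuf
syntax = "proto3";

// Hypothetical service definition -- names are illustrative only.
service CameraService {
  // Returns the cameras known to the server.
  rpc ListCameras(ListCamerasRequest) returns (ListCamerasResponse);
}

message ListCamerasRequest {}

message Camera {
  string name = 1;
  bool online = 2;
}

message ListCamerasResponse {
  repeated Camera cameras = 1;
}
```

Running protoc with the gRPC plugin over a file like this generates the client stub and the server’s base service class; the client then calls ListCameras on the stub as if it were a local method.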
The main function of the app is a video player that plays back recorded video from the database. To play the video, the timestamps first have to be modified (see Managing h264 Presentation Timestamps (PTS)); then AVFoundation’s AVSampleBufferDisplayLayer is used to play the h264 video. The same class is available on both macOS and iOS, and it provides hardware-accelerated h264 decoding.
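The timestamp fix-up can be sketched as follows. Each recorded segment starts its PTS values near zero, so to play segments back-to-back the player adds a running offset so the timestamps form one monotonically increasing timeline. This is a simplified, self-contained C++ sketch of the idea, not the app’s actual Objective-C++ code:

```cpp
#include <cstdint>
#include <vector>

// One frame's presentation timestamp, in a 90 kHz clock (typical for h264).
using Pts = int64_t;

// Rebase a sequence of segments so playback is continuous: each segment's
// timestamps are shifted so its first frame starts exactly one frame
// duration after the previous segment's last frame.
std::vector<Pts> rebaseSegments(const std::vector<std::vector<Pts>>& segments,
                                Pts frameDuration) {
    std::vector<Pts> out;
    Pts nextStart = 0;  // where the next segment should begin
    for (const auto& seg : segments) {
        if (seg.empty()) continue;
        Pts offset = nextStart - seg.front();
        for (Pts pts : seg) out.push_back(pts + offset);
        nextStart = out.back() + frameDuration;  // next segment follows on
    }
    return out;
}
```

With the timestamps rewritten into one continuous timeline, the rewritten sample buffers can simply be enqueued on the display layer in order.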
Here’s what the iOS app looks like on an iPhone 7 simulator. This is the main screen the app launches to, which shows a live stream of the selected camera:
The timestamp at the bottom of the frame is the timestamp of the video segment from the database plus the presentation timestamp of the currently displayed video frame. Zooming and panning the video are supported with touch gestures:
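That overlay calculation can be sketched in a few lines (simplified, with hypothetical names): the wall-clock time shown is the segment’s start time from the database plus the current frame’s PTS converted to seconds.

```cpp
#include <cstdint>

// Compute the wall-clock time (Unix seconds) to show for the current frame:
// the segment's database start time plus the frame's PTS converted from a
// 90 kHz clock to seconds. Names and timescale are illustrative.
double displayTimestamp(double segmentStartUnixSeconds,
                        int64_t framePts,
                        int64_t ptsTimescale = 90000) {
    return segmentStartUnixSeconds +
           static_cast<double>(framePts) / static_cast<double>(ptsTimescale);
}
```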
Here’s the date picker, displayed when the date on the screen is touched. It lets you choose whether to play back video from a specific date or switch to live video:
The camera picker is displayed when the camera name is touched, and it makes it easy to change cameras:
Finally, there’s the event picker, which shows the time and location of each detected motion event. Touching an entry starts playing back the video recorded at that time:
That’s it for now.