iOS, Android 2018–2019

Angelcam INC

Feeling safe is simply essential

1 year, 5 500+ hours

01 Home

Angelcam is a start-up which adds a community element to household security. Its solution aims at preventing false alarms caused, for example, by pets. Before setting the alarm off, the user or any assigned “angel” receives a notification.


Client:

Angelcam INC

System:

iOS & Android

Year:

2018–2019

Difficulty:

5 500+ hours


02 Assignment

All new


Big things don’t happen without big changes.

There were ambitious plans for the app, but its solution at the time had reached its limits, both in design and technology. It was therefore decided to rebuild the app completely, to ease the implementation of future security-technology features and to significantly improve user comfort.

Concept

Design

UX

Development

Deployment

Service

To create native Android and iOS mobile apps that are ready for further development by the in-house team.

This meant we used no proprietary libraries, and the entire codebase stays with the client.


Dashboard
My profile
My cameras
My notifications
Events
Camera sharing

03 Overcoming obstacles

How it works

01 Video streaming is always a challenge on desktop computers, and mobile apps demand even more skill. The app uses two video formats that adjust dynamically to the given conditions.

02 Synchronizing the app with the existing backend solution was another challenge.

03 The Angelcam environment revolves around the timeline, so our task was to keep it synchronized with a virtually endless stream of footage from security cameras.

04 The assignment required the timeline to be controlled with several gestures; implementing multiple gestures within a single element isn't common practice.


04 Highlights

How we resolved it


01

Video stream

To ensure the best quality and reliability, we implemented the video stream using the HTTP Live Streaming (HLS) protocol. Wherever speed mattered and top quality wasn't a main criterion (e.g. previews), we used an MJPEG stream. MJPEG also serves as a fallback when the HLS stream isn't available, which the app resolves on its own.
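The selection logic described above can be sketched as a small pure function. This is an illustrative sketch, not the production code; the names (`StreamType`, `CameraStream`, `selectStream`) are assumptions.

```kotlin
// Sketch of the fallback rule: prefer HLS for full-quality playback,
// drop to MJPEG for previews or when HLS is unavailable.
enum class StreamType { HLS, MJPEG }

data class CameraStream(val hlsAvailable: Boolean, val isPreview: Boolean)

fun selectStream(stream: CameraStream): StreamType =
    when {
        stream.isPreview -> StreamType.MJPEG   // previews favour speed over quality
        stream.hlsAvailable -> StreamType.HLS  // best quality and reliability
        else -> StreamType.MJPEG               // automatic fallback when HLS is down
    }
```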

02

Stream status detection

We also implemented stream status detection, which lets the user keep track of potential outages.

03

Events on timeline

Every time the timeline moves, it is synchronized with the currently playing stream. The timeline is divided by day and clearly displays all events, such as important events from motion detectors.

04

Simple control

We made it possible to control the timeline with gestures many users already know from working with photos.
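One photo-like gesture is pinch-to-zoom. The geometry behind it can be shown with a small pure function that rescales the visible window of the timeline around the gesture's focal point, keeping that point fixed under the fingers. This is a sketch under assumed names (`TimeWindow`, `pinchZoom`), not the production gesture code.

```kotlin
// A pinch gesture with scale > 1 zooms in (shrinks the visible span),
// scale < 1 zooms out, always anchored at the focal point.
data class TimeWindow(val startSec: Long, val endSec: Long)

fun pinchZoom(window: TimeWindow, scale: Double, focusSec: Long): TimeWindow {
    val span = window.endSec - window.startSec
    val newSpan = (span / scale).toLong().coerceAtLeast(1)
    // Keep the focal point at the same relative position inside the window.
    val ratio = (focusSec - window.startSec).toDouble() / span
    val newStart = focusSec - (ratio * newSpan).toLong()
    return TimeWindow(newStart, newStart + newSpan)
}
```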

01

Timeline

We created a timeline that reflects a single day, with points in time that let the user quickly find an event. The user can also create a video and share it via a link.

02

REST API

Communication with the server was implemented based on the existing REST API, which prevented us from carrying out any major modifications.

03

HTTP Live Streaming

The more demanding video stream is handled by HLS (HTTP Live Streaming). It serves both finite and live (unbounded) video streams and can adjust the data flow to current network capacity. The video is divided into small segments, e.g. of 5 seconds, listed in a playlist; the player then gradually plays through a small buffer of this playlist.
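A minimal HLS media playlist with 5-second segments looks like the fragment below. The segment file names are illustrative; a live camera stream omits the `#EXT-X-ENDLIST` tag, so the player keeps re-fetching the playlist for new segments.

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:5.0,
segment0.ts
#EXTINF:5.0,
segment1.ts
#EXTINF:5.0,
segment2.ts
```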

04

Motion JPEG

MJPEG (Motion JPEG) is used for previews or as a backup; it was already implemented in the original app. We took over this solution and improved it in order to, among other things, control the stream's quality and bandwidth demands and react better to bugs.

05

Our own implementation of playback in video recordings

Given the previous choices and the almost-finished API, we had to implement playback in video recordings ourselves (play/pause, seek, and time selection), as well as the calculation of how much video had to be synchronized with the server, since the server automatically skips recordings missing from the cloud. The native players available on Android and iOS were used to play recordings and live video. The camera list was implemented with an MJPEG stream per camera, regulated by connection type and the user-selected display.
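The gap-skipping described above means a seek on the timeline cannot always land where the user tapped: if the cloud holds no recording there, playback must jump to the next available one. A hedged sketch of that snapping rule, with hypothetical names (`Recording`, `snapSeek`):

```kotlin
// A recorded interval stored in the cloud, in seconds on the day's timeline.
data class Recording(val startSec: Long, val endSec: Long)

// Snap a requested timeline position to a playable position:
// inside a recording it stays put; inside a gap it jumps forward
// to the next recording; past the last recording it returns null.
fun snapSeek(recordings: List<Recording>, requestedSec: Long): Long? {
    for (r in recordings.sortedBy { it.startSec }) {
        if (requestedSec < r.startSec) return r.startSec  // gap: jump ahead
        if (requestedSec <= r.endSec) return requestedSec // inside a recording
    }
    return null
}
```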

05 Results

Modern, fast app

Modern, fast app

The result is a modern, fast app, fully ready for the additional features that are already in production.

Its new graphical and control elements make it easy to use and improve user comfort.

Android is implemented using the MVVM architecture including our framework and the Kotlin language.
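The MVVM split mentioned above can be sketched framework-free. The names below (`Camera`, `CameraRepository`, `CameraListViewModel`) are illustrative assumptions, not the production code, which uses the team's own framework.

```kotlin
data class Camera(val id: String, val name: String)

// Model layer: abstraction over the data source (e.g. the REST API).
interface CameraRepository {
    fun loadCameras(): List<Camera>
}

// ViewModel layer: holds UI state and notifies observers;
// no Android classes are needed for the core logic.
class CameraListViewModel(private val repo: CameraRepository) {
    var cameras: List<Camera> = emptyList()
        private set
    private val observers = mutableListOf<(List<Camera>) -> Unit>()

    fun observe(observer: (List<Camera>) -> Unit) {
        observers += observer
    }

    fun refresh() {
        cameras = repo.loadCameras()
        observers.forEach { it(cameras) }
    }
}
```

Keeping the view model free of view references is what lets the view layer be swapped or tested independently.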

When developing the iOS app, we made use of our open-source frameworks: FuntastyKit to support the MVVM-C architecture, CellKit to make working with tables easier, and FTAPIKit, which enables declarative, well-organized administration of endpoints. This lets us hand the app over to the in-house team anytime on request. Naturally, the Secure Enclave was used for secure, encrypted storage of access credentials.

6 500

monthly active users

45 000

cameras

610 TB

data in Amazon cloud and still growing
