Problems, problems everywhere.

This week I haven’t made much progress 🙁 I have spent a lot of time trying to get the SURF algorithm running on iOS, but it is still very laggy! Maybe it’s because of my old iPhone 5, but my goal is to get it running on older devices starting from the iPhone 5. Therefore I will continue to debug it and try to find a way to make it run more smoothly on my device.

Despite bad luck with the implementation of the code, I have learned quite a bit about iOS development and concepts such as view controllers. View controllers in iOS are the foundation of an app’s structure. Every app has one or several view controllers, and each of them manages a portion of the user interface as well as the interactions between the interface and the underlying data.

The main view controller class in iOS is called UIViewController. Usually, we subclass this controller and add whatever code is required to implement custom behaviour. That is exactly what I did in my app.

As I’m using openFrameworks for iOS development, UIViewController is subclassed by ofxiOSViewController, which gives me access to most of the native iOS functionality. However, if I want to use methods offered by openFrameworks, such as drawing geometric shapes, I have to create my own view controller. I did that in a class called “arAppViewController“. It creates a new view controller window and initialises an ofxiOSViewController, which then lets me use all of the openFrameworks functionality on an iOS device.
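To make this concrete, the hand-off looks roughly like the sketch below. This is illustrative rather than my exact code: apart from ofxiOSViewController itself, the names (the arAppViewController method, the ofApp class) are assumptions, and the ofxiOSViewController initialiser may differ slightly between openFrameworks versions.

```objc
// arAppViewController.mm (Objective-C++): a plain UIViewController that
// hands control over to openFrameworks when the user taps "tap to begin".
#import "arAppViewController.h"
#import "ofxiOSViewController.h"
#import "ofApp.h"

@implementation arAppViewController

// Hypothetical action wired to the "tap to begin" button.
- (void)beginTapped:(id)sender {
    // Create the openFrameworks view controller, giving it a new ofApp
    // instance that contains the camera feed and SURF tracking code.
    ofxiOSViewController *ofViewController =
        [[ofxiOSViewController alloc] initWithFrame:[UIScreen mainScreen].bounds
                                                app:new ofApp()];

    // Present it full screen so openFrameworks can draw to the whole display.
    [self presentViewController:ofViewController animated:YES completion:nil];
}

@end
```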

Currently, the usage of my application is very basic and it consists of the following actions:

  1. You launch the app by tapping the icon on the home screen.
  2. A loading screen comes up while the app loads.
  3. When loading has finished, you get to the home screen of the app, which is a simple view controller screen with one button that says “tap to begin”.
  4. When you tap the button, the AR marker image gets analysed in the backend of the app and the feed from the camera shows up on the screen.
  5. To begin tracking the AR marker images, you tap and hold anywhere on the screen.
  6. This is where the app becomes very slow, as it runs the SURF algorithm every frame and tries to detect and match features between the analysed image and the camera feed (see the sketch after this list).
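For reference, the per-frame work looks roughly like the sketch below. This is not my exact code, just a minimal OpenCV version of the detect/match step, assuming the nonfree/xfeatures2d SURF module is available; the Hessian threshold of 400 and the 0.75 ratio test are placeholder values.

```cpp
// Rough sketch of the per-frame SURF detect/match step. The exact header
// and class names depend on the OpenCV version bundled with the project.
#include <opencv2/core.hpp>
#include <opencv2/features2d.hpp>
#include <opencv2/xfeatures2d.hpp>  // OpenCV 3.x+; SURF lives in nonfree on 2.4

using namespace cv;

// Precomputed once for the marker image (keypoints + descriptors).
struct MarkerData {
    std::vector<KeyPoint> keypoints;
    Mat descriptors;
};

// Runs every frame: detect SURF features in the camera frame and match
// them against the marker's precomputed descriptors.
std::vector<DMatch> matchFrame(const Mat& grayFrame, const MarkerData& marker) {
    Ptr<xfeatures2d::SURF> surf = xfeatures2d::SURF::create(400); // Hessian threshold

    std::vector<KeyPoint> frameKeypoints;
    Mat frameDescriptors;
    surf->detectAndCompute(grayFrame, noArray(), frameKeypoints, frameDescriptors);
    if (frameDescriptors.empty()) return {};

    // FLANN-based matching with a ratio test to discard ambiguous matches.
    FlannBasedMatcher matcher;
    std::vector<std::vector<DMatch>> knnMatches;
    matcher.knnMatch(marker.descriptors, frameDescriptors, knnMatches, 2);

    std::vector<DMatch> goodMatches;
    for (const auto& m : knnMatches) {
        if (m.size() == 2 && m[0].distance < 0.75f * m[1].distance) {
            goodMatches.push_back(m[0]);
        }
    }
    return goodMatches;
}
```

Doing all of this on every single camera frame is what makes the app grind to a halt on the iPhone 5.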

I have used the debugger tool in Xcode, the IDE I’m using to develop this app, to see the performance figures when running my app:

[Screenshot: CPU/FPS figures with just the camera feed]
[Screenshot: CPU/FPS figures while running the algorithm to detect/match]

As you can see, the CPU usage of the phone is quite high even when I don’t use the algorithm at all, so I think I will have to go back to step one and find a better way to capture the video from the camera. We can also see that CPU usage increases and the FPS drops significantly when we try to run the algorithm. My best guess is that I need to optimise the base app first, so that the phone has spare CPU and memory to run the computer vision algorithms.
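To give an idea of the kind of optimisation I mean, the sketch below (again, illustrative rather than my actual code) downscales the camera frame and only runs the detect/match step every third frame inside the openFrameworks update loop. The grabber, markerData and lastMatches members and the matchFrame() helper are hypothetical names carried over from the earlier sketch, and the ofxCv conversion call may differ between openFrameworks versions.

```cpp
// Illustrative sketch: cut per-frame cost by shrinking the camera frame
// and throttling the SURF detect/match step to every 3rd frame.
// Assumes ofApp has an ofVideoGrabber `grabber`, a MarkerData `markerData`
// and a std::vector<cv::DMatch> `lastMatches` (all hypothetical names).
#include "ofApp.h"
#include "ofxCv.h"
#include <opencv2/imgproc.hpp>

void ofApp::update() {
    grabber.update();
    if (!grabber.isFrameNew()) return;

    // Only run the expensive detection on every 3rd frame.
    if (ofGetFrameNum() % 3 != 0) return;

    // Wrap the openFrameworks pixels in a cv::Mat via ofxCv.
    cv::Mat frame = ofxCv::toCv(grabber.getPixels());

    // Grayscale, half-resolution input greatly reduces the SURF workload
    // (assuming the grabber delivers RGB pixels).
    cv::Mat gray, small;
    cv::cvtColor(frame, gray, cv::COLOR_RGB2GRAY);
    cv::resize(gray, small, cv::Size(), 0.5, 0.5);

    lastMatches = matchFrame(small, markerData);
}
```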

So this is where I am at the moment, and my situation hasn’t improved much since last week. However, this week is reading week, and I’m planning to spend at least 8 hours a day debugging and fixing this issue so that I can progress further with my project.

Tasks for the upcoming week:

  • Optimise the base app’s performance so that there is more CPU power available for the computer vision algorithms.
  • Write preliminary project report.