Vuforia SDK + remote video streaming on iOS

I have recently undertaken a project on iOS that requires integration with the Vuforia SDK. Vuforia is a proprietary augmented reality framework, built for iOS and Android, which has become very popular thanks to its innovative recognition library. One of the coolest demos, and one that appeals to advertisers or people looking to incorporate commercial campaigns into their applications, is the ability to play a video on top of a target. Vuforia even provides a sample application for that. However, remote video streaming does not work on textures.

This is a long-standing issue, with people on the forums asking for a solution and getting either free solutions that are outdated and/or non-performant, or paid solutions that are very expensive.

The video-on-texture-rendering process in general

In order to stream a video onto an OpenGL texture, the following actions must happen:

  1. Initialise a renderable surface. This is a once-per-session operation.
  2. Create a texture with an ID.
  3. Assign the texture to this surface (applying shaders, etc, etc).
  4. On each render, apply transformations to the renderable surface, according to the coordinates of the recognised object.
  5. Get the video bytes and decode them to get actual video data.
  6. Convert the video data to OpenGL data (ready to be drawn).
  7. Apply that video data to the texture you created in step 2.

Steps 1 – 3 already happen inside Vuforia’s sample. Step 4 is also the reason the Vuforia SDK exists: to give you the transformation and coordinates of a recognised object inside the world coordinate space. Therefore, step 4 is also included in the sample (and in all samples from Vuforia).

The difficult part, and the part that the Vuforia SDK is not responsible for, is steps 5 – 7. This is where we, the third-party developers, come into play.

The actual problem with Vuforia’s sample:

As I have already mentioned, Vuforia’s SDK is only responsible for recognising an object in world space and providing you with its coordinates and transform. What you do with this information is up to you. Therefore, Vuforia’s VideoPlayback sample should be taken as a demonstration of what you can do with the SDK, not as a statement of its limitations.

Inside the sample, Vuforia makes heavy use of AVAssetReader and AVAssetReaderOutput in order to perform steps 5 – 7. As many people have already pointed out in the forums, AVAssetReader reads from local file URLs and does not support remote files. So, step 5 in the video-on-texture-rendering process is the problematic one: you need to decode the video data you get from a remote location into actual OpenGL data, and then render that data on-screen. Many people have said in the forums that remote on-texture rendering is not possible on iOS.
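To make the limitation concrete, here is a rough, self-contained sketch (with a hypothetical URL) of what the AVAssetReader approach runs into when it is pointed at a remote asset:

#import <AVFoundation/AVFoundation.h>

// Sketch only: trying to drive the sample's decode path from a remote URL.
static void DemonstrateAssetReaderLimitation(void)
{
    NSURL *remoteURL = [NSURL URLWithString:@"https://example.com/stream.m4v"]; // hypothetical URL
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:remoteURL options:nil];

    NSError *error = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    if (reader == nil) {
        // For a non-file-based asset the reader typically cannot be created
        // (or, if it can, -startReading fails later), so the sample's decode
        // loop never receives any sample buffers to draw.
        NSLog(@"Cannot create an AVAssetReader for a remote asset: %@", error);
    }
}

That failure is where the “not possible on iOS” claim comes from.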

This couldn’t be further from the truth.

The solution

What we need to do is get the actual OpenGL data ready to be rendered, and upload that data onto the texture created by Vuforia. The SDK and the sample have already created an OpenGL coordinate system, so all that’s left is to get the OpenGL data and divert the data flow away from the original sample code.

Instead of using AVAssetReader, we are going to use AVPlayerItemVideoOutput, which was introduced in iOS 6. This class has the method -copyPixelBufferForItemTime:itemTimeForDisplay:, which is exactly what we want to use in order to get the raw pixel data to render on the texture.

The following code samples are intended to replace / update the corresponding functionality in Vuforia’s VideoPlayback sample. The code can certainly be improved.

First, let’s set up the video player and the video output item, in order to later extract the video buffer contents.
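A minimal sketch of that setup follows. The ivar and method names here (player, playerItem, videoOutput, setupPlayerWithURL:) are placeholders of my own rather than the sample’s, and retain/release calls are omitted for brevity (the original sample is not ARC).

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

// Assumed ivars on the video player helper class:
//   AVPlayer *player;
//   AVPlayerItem *playerItem;
//   AVPlayerItemVideoOutput *videoOutput;

- (void)setupPlayerWithURL:(NSURL *)url
{
    // Request BGRA pixel buffers, which can be uploaded directly to an
    // OpenGL ES texture as GL_BGRA_EXT data.
    NSDictionary *attributes = @{
        (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
    };
    videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];

    // AVPlayerItem, unlike AVAssetReader, happily accepts a remote URL.
    playerItem = [AVPlayerItem playerItemWithURL:url];
    [playerItem addOutput:videoOutput];

    player = [AVPlayer playerWithPlayerItem:playerItem];
}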

On each rendered frame, -updateVideoData is responsible for preparing the video data for display. In the modified version of Vuforia’s sample code, it uses -copyPixelBufferForItemTime:itemTimeForDisplay: to extract the streamed video content and bind it to the OpenGL texture being rendered at that point.
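Roughly, the method takes the following shape. This is a sketch rather than the exact code: videoTextureHandle stands in for whichever GL texture ID the sample already binds for the video plane, and the BGRA output configured above is assumed.

// Within the video player helper's @implementation (OpenGL ES headers assumed).
- (void)updateVideoData
{
    CMTime currentTime = [playerItem currentTime];
    if (![videoOutput hasNewPixelBufferForItemTime:currentTime]) {
        return;   // No new frame to draw yet
    }

    CVPixelBufferRef pixelBuffer =
        [videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:NULL];
    if (pixelBuffer == NULL) {
        return;
    }

    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    GLsizei width  = (GLsizei)CVPixelBufferGetWidth(pixelBuffer);
    GLsizei height = (GLsizei)CVPixelBufferGetHeight(pixelBuffer);
    const void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);

    // Upload the BGRA frame into the texture that the sample renders on the
    // video plane. (A production version should also honour
    // CVPixelBufferGetBytesPerRow() in case of row padding.)
    glBindTexture(GL_TEXTURE_2D, videoTextureHandle);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_BGRA_EXT, GL_UNSIGNED_BYTE, baseAddress);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    CVPixelBufferRelease(pixelBuffer);
}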

There are also a few other changes we must make in order to adjust the setup of the video output.
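One such change, sketched below, is to make the helper wait until the remote AVPlayerItem is actually ready to play, instead of relying on AVAssetReader’s synchronous, local-file setup. The mediaState values here are illustrative stand-ins for whatever state flags the sample uses.

// Excerpt from the helper's @implementation.
static void *kStatusObservationContext = &kStatusObservationContext;

- (void)startObservingPlayerItem
{
    [playerItem addObserver:self
                 forKeyPath:@"status"
                    options:NSKeyValueObservingOptionNew
                    context:kStatusObservationContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if (context != kStatusObservationContext) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return;
    }

    if (playerItem.status == AVPlayerItemStatusReadyToPlay) {
        // The remote stream can now be rendered on texture.
        mediaState = READY;   // illustrative state flag
    } else if (playerItem.status == AVPlayerItemStatusFailed) {
        mediaState = ERROR;   // illustrative state flag
    }
}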

Those are all the changes needed in order to set up video playback and render the streamed video to a texture. However, Vuforia’s sample must also be updated in several places so that it understands that remote videos CAN now be played.
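For instance, any check that gates on-texture playback behind a “this must be a local file” condition has to be relaxed. Schematically (the method and flag names are illustrative, not necessarily those in the sample):

// Before (schematic): only local files were reported as playable on texture.
// - (BOOL)isPlayableOnTexture
// {
//     return isLocalFile;
// }

// After: with AVPlayerItemVideoOutput feeding the texture, any loadable
// item qualifies, remote or not.
- (BOOL)isPlayableOnTexture
{
    return YES;
}

Likewise, when the helper builds the NSURL to load, remote paths should go through +URLWithString: instead of +fileURLWithPath:.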

 

That’s it! You may need to make a few more minor changes, but this is the general concept needed to make the tutorial run. This methodology has been tested with Vuforia 4.0 and works perfectly (it is also used in an application released to the App Store).

Want the full source?

Before you download the source, please understand that there are many optimisations to be made to the example. Vuforia’s example is constructed to support iOS 4, and as such, if you target iOS 6 and later, you can get rid of at least half of the code, you can convert the project to ARC (which is certainly advised), and you can also optimise the video playback to use hardware acceleration. I have implemented all of these improvements in my released applications; however, it would be confusing to write a tutorial here that tried to cope with many problems at once.
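For reference, the hardware-accelerated upload mentioned above essentially swaps the per-frame glTexImage2D copy for a CVOpenGLESTextureCache. Very roughly, and assuming a cache created once against the sample’s EAGLContext with CVOpenGLESTextureCacheCreate():

#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// Per frame, instead of glTexImage2D:
static void UploadFrameViaTextureCache(CVOpenGLESTextureCacheRef textureCache,
                                       CVPixelBufferRef pixelBuffer)
{
    CVOpenGLESTextureRef texture = NULL;
    CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, textureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_RGBA,
        (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
        (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
        GL_BGRA_EXT, GL_UNSIGNED_BYTE, 0, &texture);

    if (err == kCVReturnSuccess) {
        // The resulting texture is backed directly by the pixel buffer,
        // so no extra copy into GL memory is needed.
        glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
        // ... draw the video plane here ...
        CFRelease(texture);
        CVOpenGLESTextureCacheFlush(textureCache, 0);
    }
}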

Grab the source here!

Edit 2017-06-14

I haven’t touched Vuforia’s SDK for a while. I will not be able to offer support with the newer SDK versions.

I have received some comments mentioning that this solution does not work with the newest versions of the Vuforia SDK.

The point, however, is that it is impossible by definition for this solution not to work. The code may change, but the approach remains valid, since it consists of streaming a video from a video source, decoding the data to OpenGL using CMSampleBufferGetImageBuffer(), and rendering the OpenGL data to a texture.

I’m sorry that I am not able to offer code-level support for this solution, but the methodology is still valid. It requires a little bit of effort and OpenGL knowledge, but it’s definitely doable.

  • Mohd Fadzley Hud

    Hi,

    Can you advise how to turn this into cloud recognition app?

    thanks

    • csotiriou

      It’s been a while since I have touched this subject, but I remember some things. Is there any specific point where you need help, or do you need general guidance?

      • Mohd Fadzley Hud

        Thank you for your reply. Basically, I’m stuck at getting the target images and the metadata which contains the video URL from the cloud database. I already made some modifications to the source code, referring to the Vuforia cloud book sample, but was not successful. I think the sample is too advanced for me 🙂

        • csotiriou

          I can give you a sample piece of code here:

          https://gist.github.com/csotiriou/238fdf8f1d130c6013f9

          Basically, all you need to do is initialise the cloud tracker instead of the local one. Then, when you have successfully recognised something, the metadata will be accessible through the resulting trackable.

          • Mohd Fadzley Hud

            Hi, thank you for the sample code. It cleared up a couple of things for me. BTW, onInitARDone was in the middle of doInitTrackers. Based on your code, I’m working on my own reco now. Thanks again for your help.

  • indiedevdad

    Hi,

    I need to recreate this solution with Unity. Do you have any suggestions?
    Thanks

  • indiedevdad

    Hi,

    I just ran your solution but the videos are not starting; there is always the X icon on them. I saw the statements below in the code, which seem correct, so can you please help me? I need to play a video on texture, not fullscreen, on iOS.
    Many thanks

    // Start video playback from the current position (the beginning) on the
    // first run of the app
    for (int i = 0; i < NUM_VIDEO_TARGETS; ++i) {
        videoPlaybackTime[i] = VIDEO_PLAYBACK_CURRENT_POSITION;
    }

    // For each video-augmented target
    for (int i = 0; i < NUM_VIDEO_TARGETS; ++i) {
        // Load a local file for playback and resume playback if video was
        // playing when the app went into the background
        VideoPlayerHelper* player = [self getVideoPlayerHelper:i];
        NSString* filename;

        switch (i) {
            case 0:
                filename = @"VuforiaSizzleReel_1.m4v";
                break;
            default:
                filename = @"VuforiaSizzleReel_2.m4v";
                break;
        }

        if (NO == [player load:@"http://vjs.zencdn.net/v/oceans.mp4" playImmediately:NO fromPosition:videoPlaybackTime[i]]) {
            NSLog(@"Failed to load media");
        }
    }

  • Kenneth Paulino

    Great fix for Vuforia 4. Does this apply to Vuforia 5, or should we continue to use Vuforia 4 if we want streaming video on a texture? Thanks for your work on this.

    • csotiriou

      Yes, this applies to Vuforia 5 as well.

  • Simon Canil

    Hi there – similar to indiedevdad’s comment, I was unable to play the videos due to the same issue using Xcode 7.1. All movie preview views were shown with a cross, indicating an error caught @ line 246 of VideoPlayerHelper.mm

  • Simon Canil
    • csotiriou

      Yes, that is indeed the issue. I will post it as an answer to the other guy asking the same thing.

  • Simon Canil

    thanks for a great sample – I’d be very keen to learn of the improvements that you discuss!

  • Fabriccio De la Mora

    Hi! First of all, thanks for the example, it has helped me a lot!

    I am working on a project that involves adding a bigger list of links, all to be played remotely.

    So, what I did was increment the number of video and augmentation textures like this:

    #define NUM_AUGMENTATION_TEXTURES 27

    #define NUM_VIDEO_TARGETS 24

    Then, I created a .xml file from the Vuforia Library and added it to my project, and then I just conditioned the triggers to point to different links:

    int playerIndex = 0;

    if (strcmp(imageTarget.getName(), "01") == 0) {
        playerIndex = 0;
    } else if (strcmp(imageTarget.getName(), "03") == 0) {
        playerIndex = 1;
    } else if (strcmp(imageTarget.getName(), "04") == 0) {
        playerIndex = 2;
    } else if (strcmp(imageTarget.getName(), "05") == 0) {
        playerIndex = 3;
    }

    switch (i) {
        case 0:
            filename = @"01.mp4";
            break;
        case 1:
            filename = @"03.mp4";
            break;
        case 2:
            filename = @"04.mp4";
            break;
        case 3:
            filename = @"05.mp4";
            break;
    }

    if (NO == [player load:[NSString stringWithFormat:@"http://xxxxxxx.com/%@", filename] playImmediately:NO fromPosition:videoPlaybackTime[i]]) {
        NSLog(@"Failed to load media");
    }

    The thing is, it sometimes plays them all, and sometimes it just shows the X icon on some videos. I wonder if it has something to do with the fact that there are a lot of triggers.

    Could you help me?

  • sindhu

    I have implemented the same using the updated Vuforia SDK, but I didn’t get it to work. I need videos from a random URL to be played on texture.
    Found many threads but none of them helped me.

    Please do share the code with updated vuforia SDK

    • csotiriou

      I have successfully released an application using Vuforia SDK 5.0 using this methodology.

      What problem are you facing?

      • sindhu

        I am having linkage issues, since QCAR is not valid in Vuforia 5. I changed it into Vuforia, but it is of no use. I Googled and replaced the build folder but none of it helped. Do you have any approach for this?

        • csotiriou

          No, this is actually a compilation issue. Linkage issues come after the compilation phase. This link describes the process of migrating from Vuforia 4 to Vuforia 5. I followed it in my project and it worked.

          https://developer.vuforia.com/library/articles/Solution/How-To-Migrate-an-iOS-SDK-project

          The missing headers are something different, and so is your project setup. When updating libraries, you should also change your Header Search Paths to the proper locations (Xcode knowledge required). The same goes for the update to Vuforia 5.

          • sindhu

            1) I can make your reference code run with Vuforia SDK 5, but I got a sample error 121.
            2) I changed loadMediaFromURL, updateVideoData and the prepare-asset method as you have demonstrated above. I can see the image and the play/pause texture but I am unable to play a video.

            Is there anything I am missing?

          • sindhu

            Hi!
            I have solved it. I did exactly the same as I mentioned in point 2 above. Thank you! @csotiriou:disqus

  • Aldair Carrillo de Haro

    Have you tested this code with the new SDK 6?

  • Aldair Carrillo de Haro

    Hello, Have you tested this code with the new SDK 6.0?

    • csotiriou

      No, I haven’t; it’s been a while since I have touched Vuforia.

    • Galih Suryo

      it works

      • Dil rock

        Hi Galih,
        can you please share the working code with SDK 6.0?
        mail me at mainid007@gmail.com

        I tried so many times to convert this code but it never worked 🙁
        it would be really helpful.

        -Dilip

        • David Peñuela

          Hi Dil, could you also share the code with me, please? I can’t make it work…
          thanks!
          davidpenuelab@gmail.com

      • hepos

        Hi Galih! Do you have a working solution for video playback on texture with SDK 6.0? I really need to figure out how to get this to work. Would it be possible for you to share the code or explain what changes you’ve made?

  • biyyala kamal

    Hi,

    I am a Unity3D developer. I am going to develop an AR app with streaming videos (URLs).
    I need to play multiple streaming videos. Is that possible with Unity3D?

    biyyala.kamalakar@credencys.com is my mail id

  • David Peñuela

    Hi,
    When I run your example in Vuforia 4 (I have 4, 5 and 6, but wanted to test with 4 first) it crashes in
    – (BOOL)prepareAssetForPlayback
    {
    // Get video properties
    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack *videoTrack = videoTracks[0];

    because:
    ‘NSRangeException’, reason: ‘*** -[__NSArrayM objectAtIndex:]: index 0 beyond bounds for empty array’
    *** First throw call stack:

    Any suggestion? could you share the app working with vuforia 5?
    Thanks @csotiriou:disqus

  • Kim Ska

    hi,
    Thanks for your great Help.
    but I have one problem.

    I tried your solution with the Vuforia 6.0 SDK,
    but it does not work. I can’t see the video play button or the videos.
    If I tap the screen, I can hear the sound.

    Have you succeeded on Vuforia 6.0 sdk?
    Can you help me?

  • Krunal Darji

    @csotiriou:disqus Thanks for the great tutorial. I am able to achieve this using the latest Vuforia 6.2 SDK, but there is one issue: I can only hear the audio of the video; the texture is not visible. Any suggestion?

  • Mohit Tomar

    not working with Vuforia v6.2

    • csotiriou

      I edited my post. Please take a look at the bottom section.

  • Tati

    Tried this solution. It works. June 26… Xcode 8.3 iOS 10, Vuforia 6.2

    • Akshay Sharma

      hi, please help me.
      It’s not working; only a black screen comes up with the play and stop buttons.

  • Mohit Tomar

    When I’m trying to play a video with the 3rd or 4th card, it gives me “No video sample buffer available” but plays the audio, and shows a stuck image of the last played video.