Category: music app

How to Track Conversions for iOS App Sales

For the latest version of Tessitura Pro (1.9.5), I created a short promo video and started an in-stream video campaign in Google AdWords.

The most important aspect of a campaign is being able to track conversions (how many views of the video turn into actual installations of the app).

Here are the steps I followed to make this happen.

  1. Create and Upload the video to YouTube
  2. Create a new Video campaign in Google AdWords

    Make sure you choose Mobile app installs and find your app using the search in the Your mobile app drop-down search box. Once you’ve selected your app, new options will appear.
    Choose the Bidding options, etc.
    I chose specific Advanced mobile and tablet options, since I want potential viewers of my promo video to buy the app directly from within the video (actually, it takes them to the App Store, but on the same device on which they are viewing the video).

    Then name the Ad group for the campaign (you could have many different promo videos, or different settings for the same video, in your campaign; each of those would be an Ad group linked to one video).

    And search for your video on YouTube.
    Once you’ve selected your video you will have more options to choose from.
    I am using an In-stream ad, which will appear at the beginning of some other video, but you may choose the Video discovery ad type, which will appear as a recommended video on some part of the screen, depending on the device.

    You will also need to name this specific ad in the Ad name box (since you might want to show the same video as a different kind of ad, for example).

    Then you will be taken to the next page, which links to the Conversions tool.

  3. Click on the Conversions tool link
  4. Find and click on the name of the conversion you’ve just created (in my case it is Tessitura Installation; you may rename it as well)
  5. On the next page, choose how you will set up conversion tracking.
    I chose Put tracking code into the app
  6. Download the Google Conversion Tracking SDK
  7. Open your project in Xcode, unzip the downloaded file, and drag the entire folder into your project. Make sure you have the Add to targets option selected
  8. The SDK library references the iOS AdSupport development framework which may not already be part of your project. To add this framework, open the Link Binary With Libraries dropdown under the Build Phases tab. Add the framework from the iOS SDK using the + button.
  9. Also, you need to add -ObjC to the Other Linker Flags of your application target’s build settings:
    1. In Xcode’s project navigator, select the blue top-level project icon.
    2. Click on your target, then the Build Settings tab.
    3. Under Linking > Other Linker Flags, add -ObjC to both Debug and Release.
  10. Finally, you need to add the [ACTConversionReporter…] code snippet to didFinishLaunchingWithOptions in your AppDelegate.m (see the sketch just after this list)
  11. Now when you run your project you should get a successful ping to Google in your project’s console window
  12. If you go back to the Conversions page in Google AdWords you will eventually see a change in the Tracking Status column saying Recording conversions (Google says it takes a couple of days; it worked sooner for me)
  13. It is important that you add a Call-to-action overlay on your promo video.
    So go to your video’s edit page in your YouTube account and choose the Call-to-action overlay tab. Add a headline, a display URL (I used my website mDecks.com), a destination URL (use the complete iTunes Store URL for your app without the https://, e.g. itunes.apple.com/us/app/tessitura-pro/id1144493337?ls=1&mt=8), and your app’s icon as a 74×74 image
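For reference, the snippet from step 10 looks roughly like the sketch below. The conversion ID, label, and value are placeholders: use the actual values AdWords shows on your conversion’s code-snippet page, and check the header name against the SDK version you downloaded.

#import "ACTReporter.h" // header from the downloaded SDK; name may differ by version

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // Placeholder values: copy the conversion ID and label that AdWords
    // generated for your conversion (mine is Tessitura Installation).
    [ACTConversionReporter reportWithConversionID:@"YOUR_CONVERSION_ID"
                                            label:@"YOUR_LABEL"
                                            value:@"1.00"
                                     isRepeatable:NO];
    return YES;
}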

 

That’s all. Now I am able to track every single Tessitura Pro installation from the promo video I’ve created as a conversion.


iTunes Connect error -22421 solved!

Submitting an app to iTunes Connect without errors is not as simple as it should be, although the review process is now much faster than it used to be. Back in 2015, app reviews would take almost a week before you knew whether the app had been rejected or not. These days (and I am talking November 2016), app reviews take only one day before the app is live on the App Store.

Today I tried uploading a new version of Tessitura Pro (1.9.4) to iTunes Connect from within Xcode 8.1 and I got an error with code -22421.


Searching on the internet I couldn’t find any case that applied to mine. Apparently -22421 is returned as an error code for several different reasons.

Here’s the problem I had: on my previous (most recent) version of Tessitura, I had selected 9.3 as the iOS version in the Deployment Target. But for some reason, after opening the project in Xcode 8.1, my deployment target had changed to 8.4.


I am guessing you can’t downgrade the deployment target on an app (although I don’t know this for a fact). I changed it back to 9.3 and the new Tessitura build uploaded without any problems.
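If you want to double-check the value outside the Xcode UI, the setting is stored as IPHONEOS_DEPLOYMENT_TARGET in the project file. As a sketch, the relevant line in project.pbxproj (it appears once per build configuration) looks like:

IPHONEOS_DEPLOYMENT_TARGET = 9.3;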

In this new version I am adding a Google AdWords conversion tracking snippet to track installs of the app, and I initially thought that was the problem. I still don’t know whether the tracking code will be accepted in the review process, but I will write a new post with my findings once the review is done.

Putting iPhone’s CoreMotion Technology to the Test

In our latest music app, “Sounds of Christmas by mDecks Music”, we wanted to create the illusion of playing a virtual instrument with your iPhone in 3D space. The user would hit the “air” as if it were bells, and the iPhone would produce sound (notes, chords, etc.).

To achieve this effect, we had to use the iPhone’s CoreMotion technology, which gives access to the iPhone’s entire motion detection system. You can read more about it in my previous post, Turning an iPhone into a Musical Instrument.

Now, if you have ever worked with MIDI, samplers, and sequencing software, you know what latency is. For those who do not: latency is the amount of time it takes all your software and hardware to turn an “action” (command, process, etc.) into sound.

There’s always latency involved in sound production, regardless of whether you are using computers or not. The speed of sound is not infinite, which could be interpreted as an embedded amount of latency (you know how, in a tennis match, you see the ball being hit and only later hear the sound). There’s latency when you play a real acoustic piano, since your action of pressing a key needs to be mechanically translated into a hammer hitting a string. Of course, these conversions are so fast that the pianist does not notice them (or at least can cope with them).

When using MIDI, software, and samplers, we add a whole new set of conversions that need to happen before sound is finally produced. Your action on the keyboard is turned into a MIDI signal; the MIDI signal is turned into a note request for the sampler, which has to ask the computer to access the hard drive (or memory, if possible) and read the sound file; that file is then processed into a music note and sent to your sound card, which converts the information into sound and outputs it through your speakers.

An acceptable latency is one that produces the sound fast enough for you to perceive it as real time, without affecting your playing no matter how fast you play. Musicians are attuned to that delay much more than the average listener: a delay of 20ms is big enough to be noticeable, and an acceptable latency for musicians is below 12ms (for reference, 12ms is roughly the time sound takes to travel four meters through the air).

If we want to use the iPhone’s motion to trigger notes, we have to treat that motion as if it were a key being played on a keyboard: turn it into MIDI, send it to a sampler, and so on. It was crucial to create a method that could detect and process motion in a very short amount of time (<1ms), and the question was whether the iPhone’s CoreMotion was fast enough to make this possible.

After our first test we realized that not only is the iPhone’s CoreMotion incredibly fast, it is also so precise that we were able to simulate virtual bells being hit by the iPhone at any tempo we wanted.

Sounds of Christmas by mDecks Music also turned into a rhythmic and sight-reading training tool where the user can perform a song at any tempo, and we decided to include rhythm scores for all songs for music teachers and students, since this is such an intuitive way of learning and practicing rhythms.

You can print all the scores from your iPhone within the app, or you may download the eBook in PDF format with all songs notated as rhythms.

Turning the iPhone into a Musical Instrument

While developing our new Christmas Puzzles by mDecks Music app, we experimented with the iPhone’s CoreMotion framework. CoreMotion lets you detect and track movements of the iPhone in real time.

There are many different approaches to implementing apps that handle iPhone movement. The most frequently used technique is a method that detects the iPhone being shaken, but it is not precise and did not work for us. For a more precise tracking method you need to use CoreMotion.

In our solution, we were able to implement a routine that detects the iPhone’s movement when the user pretends to hit imaginary bells in front of him/her. Our algorithm ended up being very simple, and the response really fast and efficient. We managed to create different detection algorithms and included them in the app.

We created music files that play a song using the rhythm performed by the user as the source. You may print the rhythm score to practice your rhythmic sight-reading and/or learn how to read and play rhythms while performing a real song, which we think makes practicing rhythms very intuitive and fun.

We turned our first version into an all-Christmas app called Sounds of Christmas by mDecks Music, which at the moment is waiting for review on the App Store. ( Sounds of Christmas webpage )

Here are our video demos:

To do this you need a CMMotionManager.

#import <CoreMotion/CoreMotion.h>

// One shared CMMotionManager for the whole app, as Apple recommends.
static CMMotionManager *motionManager = nil;

- (CMMotionManager *)motionManager
{
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        motionManager = [[CMMotionManager alloc] init];
    });
    return motionManager;
}

and then start updating the iPhone’s motion at a specific updateInterval.

- (void)startUpdatingWithFrequency:(int)fv
{
    NSTimeInterval delta = 0.005;
    NSTimeInterval updateInterval = 0.01 + delta * fv;
    // Check and configure device motion (not the raw accelerometer),
    // since we are starting device-motion updates below.
    if ([[self motionManager] isDeviceMotionAvailable]) {
        [[self motionManager] setDeviceMotionUpdateInterval:updateInterval];
        [[self motionManager] startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                                  withHandler:^(CMDeviceMotion *motion, NSError *error) {
            // Call your motion-handling routine here with the fresh CMDeviceMotion.
        }];
    }
}
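For example, calling [self startUpdatingWithFrequency:0] sets a 0.01s interval (100 updates per second), and each increment of fv adds delta (0.005s) to the interval, so fv = 2 gives a 0.02s interval (50 Hz).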

You can get all the updated information by using:

motion.userAcceleration.x; //or y or z
motion.attitude.roll; // or pitch or yaw
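(userAcceleration is the acceleration the user imparts to the device, with gravity already filtered out, measured in g; roll, pitch, and yaw are the device’s rotation around its three axes, in radians.)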

Once you’ve obtained this data in real time, it’s up to you to create an algorithm that interprets the device’s movement. It is crucial to set a coherent updateInterval: it can’t be too short or too long, and you have to find it by trial and error.
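As an illustration (this is not the exact algorithm we shipped), here is a minimal sketch of a hit detector you could call from the device-motion handler above. The threshold and debounce values are made-up starting points, and playBellSound is a hypothetical placeholder for whatever triggers your note:

- (void)detectHitFromMotion:(CMDeviceMotion *)motion
{
    static NSTimeInterval lastHit = 0;  // shared across calls; fine for a sketch

    // userAcceleration has gravity removed; its magnitude is in g units.
    CMAcceleration a = motion.userAcceleration;
    double magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z);

    NSTimeInterval now = [NSDate timeIntervalSinceReferenceDate];
    // Fire on a sharp spike, then ignore input for 150 ms to avoid double triggers.
    if (magnitude > 1.5 && now - lastHit > 0.15) {
        lastHit = now;
        [self playBellSound];  // hypothetical: replace with your sound-triggering code
    }
}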