Turning the iPhone into a Musical Instrument
While developing our new Christmas Puzzles by mDecks Music app, we experimented with the Core Motion framework on the iPhone. Core Motion lets you detect and track movements of the iPhone in real time.
There are many different approaches to implementing apps that handle iPhone movement. The most frequently used technique is the method that detects when the iPhone is shaken, but it is not precise and did not work for us. For a more precise tracking method you need Core Motion.
In our solution, we implemented a routine that detects the iPhone's movement when the user pretends to strike imaginary bells in front of him or her. Our algorithm ended up being very simple, and the response really fast and efficient. We managed to create several different detection algorithms and included them in the app.
We created music files that play a song using the rhythm performed by the user as the source. You may print the rhythm score to practice rhythmic sight reading and/or learn how to read and play rhythms while performing a real song, which we think makes practicing rhythms very intuitive and fun.
We turned our first version into an all-Christmas app called Sounds of Christmas by mDecks Music, which at the moment is waiting for review on the App Store. ( Sounds of Christmas webpage )
Here are our video demos:
To do this you need a CMMotionManager.
#import <CoreMotion/CoreMotion.h>

// Lazily creates the manager exactly once; assumes an instance variable
// CMMotionManager *motionManager; declared on the class.
- (CMMotionManager *)motionManager
{
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        motionManager = [[CMMotionManager alloc] init];
    });
    return motionManager;
}
and then start updating the iPhone's motion at a specific updateInterval:
- (void)startUpdatingWithFrequency:(int)fv
{
    NSTimeInterval delta = 0.005;
    NSTimeInterval updateInterval = 0.01 + delta * fv;
    if ([motionManager isDeviceMotionAvailable]) {
        [motionManager setDeviceMotionUpdateInterval:updateInterval];
        [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                           withHandler:^(CMDeviceMotion *motion, NSError *error) {
            // call your motion-handling routine here
        }];
    }
}
You can get all the updated information by using:
motion.userAcceleration.x; //or y or z
motion.attitude.roll; // or pitch or yaw
Once you’ve obtained this data in real time, it’s up to you to create an algorithm that interprets the device’s movement. It is crucial to set a coherent updateInterval: it can’t be too short or too long, and you have to find it by trial and error.