Posts tagged ‘iPhone SDK’

iPhone Game Development presentation (CIDUG meeting, August 25, 2009)

Mac Liaw gave a presentation on iPhone game development at the Columbus iPhone Developer User Group on August 25, 2009. He mostly talked about how developing games for the iPhone differs from developing for the typical video game consoles.

I can sort of understand some of what he was talking about. In my experience developing games for the Atari 2600, a one or two person team with a good idea can produce a killer product. (Of course, the iPhone has at minimum 128 megabytes of RAM, while the Atari 2600 had 128 bytes of RAM, which is a topic for another blog post.) I have been kicking around a couple of ideas for games; now all I have to do is learn OpenGL ES. And find an artist who will work for peanuts.

How to Build an iPhone App that Doesn’t Suck

A guest lecture for the Stanford University iPhone Application Programming class called “How to Build an iPhone App that Doesn’t Suck!” was hosted by Steve Marmon on May 8, 2009. Steve covered some of his ideas on, shockingly enough, guidelines for designing a good iPhone application.

One thing I found interesting was his take on UI layout. He suggested that an interface should be designed 10 times instead of just once, the theory being that by the time you reach the 10th design you have fleshed out all of the ideas for the interface.

Stanford iPhone App Programming lecture 15

Lecture 15 from the Stanford University iPhone Application Programming class was hosted by Justin Santamaria from Apple. Justin covered the photo picker, Core Location, and accelerometer topics during his presentation, none of which were particularly pertinent to my current projects.

Unfortunately, Justin did not show live demos of these components, which is understandable given the simulator's limited support for them. The code snippets in the slides should be sufficient for getting these things up and running.
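
For my own later reference, here is a minimal Core Location sketch; the LocationDemoViewController class is hypothetical and the code is mine rather than from the slides, but the CLLocationManager calls and delegate methods are the standard ones:

#import <CoreLocation/CoreLocation.h>

// Hypothetical controller that owns a CLLocationManager and acts as its delegate.
@interface LocationDemoViewController : UIViewController <CLLocationManagerDelegate> {
	CLLocationManager *locationManager;
}
@end

@implementation LocationDemoViewController

- (void)viewDidLoad {
	[super viewDidLoad];
	locationManager = [[CLLocationManager alloc] init];
	locationManager.delegate = self;
	locationManager.desiredAccuracy = kCLLocationAccuracyHundredMeters;
	[locationManager startUpdatingLocation];
}

// Called whenever a new location fix comes in.
- (void)locationManager:(CLLocationManager *)manager
	didUpdateToLocation:(CLLocation *)newLocation
	       fromLocation:(CLLocation *)oldLocation {
	NSLog(@"lat: %f, lon: %f",
	      newLocation.coordinate.latitude, newLocation.coordinate.longitude);
}

- (void)locationManager:(CLLocationManager *)manager didFailWithError:(NSError *)error {
	NSLog(@"location error: %@", error);
}

- (void)dealloc {
	// stopping updates when you no longer need them also helps with battery life
	[locationManager stopUpdatingLocation];
	[locationManager release];
	[super dealloc];
}

@end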

At the end of the presentation, Justin also covered some hints on maximizing battery life on the platform.

Stanford iPhone App Programming lecture 14

Lecture 14 from the Stanford University iPhone Application Programming class was hosted by Steve Demeter from Demiforce and Josh Shaffer from Apple. Steve talked about his experience in putting together Trism, and his ideas on the touch interface on the iPhone. Josh then talked technically about the methods, events, and objects that make up the touch interface on the iPhone, and how to use them.

I probably should have watched this video before trying to do swipe detection (see this blog post) in my application; it would probably have saved me some search time and trial and error. Josh used a CGAffineTransform structure to track changes, which is probably the recommended way of doing this sort of thing. It is a bit of overkill for my particular application, as I don't need to know about zooming and rotation; I just needed to know whether the person's finger was moving left or right.
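
For the record, here is my rough reconstruction of the transform idea (not Josh's actual demo code); it assumes a hypothetical draggableView ivar and simply accumulates one-finger drags into that view's transform:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
	UITouch *touch = [touches anyObject];
	CGPoint current  = [touch locationInView:self.view];
	CGPoint previous = [touch previousLocationInView:self.view];

	// fold this move's delta into whatever transform is already applied to the view
	CGAffineTransform delta = CGAffineTransformMakeTranslation(current.x - previous.x,
	                                                           current.y - previous.y);
	draggableView.transform = CGAffineTransformConcat(draggableView.transform, delta);
}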

Three20 Presentation (CIDUG meeting, July 28, 2009)

Justin Searls gave a very good presentation on the Three20 toolkit for the iPhone SDK. The presentation was given at the Columbus iPhone Developer User Group on July 28, 2009.

The Three20 toolkit has a lot of interesting additions and extensions to the iPhone SDK, the most used of which is a photo and thumbnail browser that is based on the ones written for the Facebook iPhone application.

Here is a link to the Columbus iPhone Developers User Group:

CIDUG

And here is a link to Justin’s posting to the group, which includes instructions on where to find the Three20 code and information on installation and usage:

7/28 CIDUG Meeting on Three20

Stanford iPhone App Programming lecture 13

Lecture 13 from the Stanford University iPhone Application Programming class was hosted by Alan Cannistraro. He covered exceptions and debugging, using the UISearchBar, notifications, and key value coding.

I did not know you could set a breakpoint on objc_exception_throw; this seems like an awesome way to track down exactly where exceptions are happening in the code. Also, some of the key value coding and key value observing items he covered were pretty interesting and relevant.
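
To jog my memory later, here is a minimal sketch of the key value coding and observing pattern; the Person class and its "name" key are made up for illustration, and the observer methods are assumed to live in some view controller:

// Hypothetical model object, just for illustration.
@interface Person : NSObject {
	NSString *name;
}
@property (nonatomic, copy) NSString *name;
@end

@implementation Person
@synthesize name;
- (void)dealloc { [name release]; [super dealloc]; }
@end

// In the observing class (a view controller, say):
- (void)kvcAndKvoDemo {
	Person *person = [[Person alloc] init];

	// key value coding: read and write properties by string key
	[person setValue:@"Bob" forKey:@"name"];
	NSLog(@"name is %@", [person valueForKey:@"name"]);

	// key value observing: watch the "name" key path for changes
	[person addObserver:self
	         forKeyPath:@"name"
	            options:NSKeyValueObservingOptionNew
	            context:NULL];
	person.name = @"Alice"; // triggers the observer method below
	// (real code should eventually call removeObserver:forKeyPath: and release person)
}

// Called whenever an observed key path changes.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
	NSLog(@"%@ changed to %@", keyPath, [change objectForKey:NSKeyValueChangeNewKey]);
}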

Simple swipe detection in the iPhone SDK

A quick round of hallway usability testing of my iPhone app revealed that iPhone and iPod Touch users apparently like to swipe and have no idea that the UIPageControl (the bar with the little white dots that shows which page of information you are on) can be tapped to change pages. Instead, my admittedly small sample size of two (thanks John and Ben) tried to swipe all over the view to move it from one page to another. As a result, I set out to find information about detecting the swipe motion in the iPhone SDK.

Quick searches of the internet and documentation did not reveal any immediate solutions, so I set about figuring out how to do it on my own. The first thing I noticed is that you can hook into the touchesBegan event for the view, so I created this code:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
	CGPoint pt;
	NSSet *allTouches = [event allTouches];
	if ([allTouches count] == 1)
	{
		UITouch *touch = [[allTouches allObjects] objectAtIndex:0];
		if ([touch tapCount] == 1)
		{
			pt = [touch locationInView:self.view];
			touchBeganX = pt.x;
			touchBeganY = pt.y;
		}
	}
}

(After adding the following variable definitions at the top of my view controller implementation file, of course.)

int touchBeganX, touchBeganY;

So now I have the position where a single touch began. The next thing I thought I would do was override the touchesEnded event, but for some reason the tapCount of the touch I read in touchesEnded was zero, which was a little confusing. I then turned to the touchesMoved event:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
	CGPoint pt;
	NSSet *allTouches = [event allTouches];
	if ([allTouches count] == 1)
	{
		UITouch *touch = [[allTouches allObjects] objectAtIndex:0];
		if ([touch tapCount] == 1)
		{
			pt = [touch locationInView:self.view];
			touchMovedX = pt.x;
			touchMovedY = pt.y;
		}
	}
}

Yes, this looks shockingly similar to the code for touchesBegan. By the way, don’t forget more variable definitions:

int touchMovedX, touchMovedY;

I can now track where the touch began and the last place it moved to. I then swung back around to the touchesEnded event to do the actual swipe detection; even though the tapCount is zero, I can safely determine that a single touch-and-drag occurred and then test whether it was a swipe.

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
	NSSet *allTouches = [event allTouches];
	if ([allTouches count] == 1)
	{
		int diffX = touchMovedX - touchBeganX;
		int diffY = touchMovedY - touchBeganY;
		// only treat it as a swipe if the vertical drift stayed within 20 points
		if (diffY >= -20 && diffY <= 20)
		{
			// and the horizontal movement was more than 20 points either way
			if (diffX > 20)
			{
				NSLog(@"swipe right");
				// do something here
			}
			else if (diffX < -20)
			{
				NSLog(@"swipe left");
				// do something else here
			}
		}
	}
}
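
For completeness, here is a sketch of what those "do something" comments might turn into when the swipe drives a UIPageControl; the pageControl outlet and the showPage: helper are hypothetical rather than actual code from my app:

// Hypothetical handler: move the page control (and the content it drives)
// one page in response to a detected swipe.
- (void)handleSwipeToPage:(int)newPage {
	if (newPage < 0 || newPage >= pageControl.numberOfPages)
		return; // ignore swipes past either end

	pageControl.currentPage = newPage;
	[self showPage:newPage]; // hypothetical method that swaps in the new page's content
}

The "swipe right" branch would then call [self handleSwipeToPage:pageControl.currentPage - 1] and the "swipe left" branch would use currentPage + 1, or the other way around, depending on whether you want the content to follow the finger.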

Perhaps reading swipes is so simple to do in the iPhone SDK that I missed it in the docs or online. If anyone has any better ways to do this, or a web site or blog post that explains it better, please let me know.


Somewhat related, somewhat unrelated tangent alert:

My father used to have this bag that said “I Like Swipe”. I think he said he used to sell the product, some kind of cleaning chemical, when he was a boy back in the 1940s. I wish he still had that bag; it was pretty retro looking, and a picture of it would have fit in perfectly with this posting. I guess I will just have to settle for this:
"I Like Swipe"

Setting up an Objective-C delegate for the iPhone SDK

Once Ben Gottlieb of Standalone helped me get past the problem of creating and displaying my main view controller class, I set out to figure out how to do an Objective-C delegate. Unfortunately, none of the examples or tutorials I could find on the internet had a simple, straightforward explanation of what I needed to do to get it working.

After much trial and error, and reading as many descriptions of delegates as I could find, I got it working. Here are the steps I had to go through in my code:

In my view subclass header file, this line was added to the top:

@protocol KeyboardViewDelegate;

This line was added to the class definition in the subclass header file:

id <KeyboardViewDelegate> delegate;

This line was added to the property definitions in the subclass header file:

@property (nonatomic, assign) id <KeyboardViewDelegate> delegate;

And this section was added to the bottom of the subclass header file:

@protocol KeyboardViewDelegate<NSObject>
- (void) keyPressHandler:(int)keyPress;
@end

In my view subclass implementation file, this synthesize was added near the top:

@synthesize delegate;

And this section was added to the method in the subclass view that is wired up (via Interface Builder) to be called when a button is pressed on the view:

if(delegate && [delegate respondsToSelector:@selector(keyPressHandler:)]) {
    [delegate keyPressHandler:key];
}

In the view controller header file, of course the view subclass header needs to be included:

#import "KeyboardView.h"

I had to add the delegate protocol name, in angle brackets, to the interface definition in the view controller header, changing it to look like this:

@interface MainViewController : UIViewController <KeyboardViewDelegate> {

And at the bottom of the view controller header file, I added a definition for the method that matches up with the one from the keyboard view:

- (void) keyPressHandler:(int)keyPress;

In the view controller implementation file, again the view subclass header needs to be included:

#import "KeyboardView.h"

In the viewDidLoad method of the view controller, during the creation of the view subclass, I had to set the subclass delegate to be the view controller:

CGRect sizeRect = CGRectMake(0, 225, 320, 190);
KeyboardView *view = [[KeyboardView alloc] init];
view.view.frame = sizeRect;
[view setDelegate:self];  // this line is pretty important !!!
[self.view addSubview:view.view];

And at the bottom of the view controller implementation file, the method declared above is implemented:

- (void) keyPressHandler:(int)keyPress
{
    NSLog(@"keyPressHandler just fired");
    // do something here
}

And that is it! When running in the simulator, the method in the view controller is fired whenever the method in the view subclass is fired.

Oh, and by the way, I am using a UIPageControl on my view controller, and I found out that, for some reason, I had to go into the viewDidLoad method and add the following line:

pageControl.backgroundColor = [UIColor grayColor];

If I did not add this line, the page control was never visible, even though it was working: when I tapped on either side of it, the code associated with it fired correctly. The background color of the page control was already set to gray in Interface Builder. Does anyone have any ideas why that would be?

Stanford iPhone App Programming lecture 12

Lecture 12 from the Stanford University iPhone Application Programming class was hosted by Alex Aybes. Alex covered using the address book functionality provided in the iPhone SDK.

His talk included several demonstrations of the C-based Address Book API, along with some of the ins and outs of Core Foundation, which shares many of the same concepts as the Objective-C based Foundation framework. His demos showed how to use the various person view controllers, how to get person entries from the user's Contacts application, and how to dive into those entries to read and update the values they contain.
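
To give a flavor of the C-based API, here is my own boiled-down sketch (not code from the lecture) that walks the address book and logs each person's first name:

#import <AddressBook/AddressBook.h>

// Minimal sketch: log the first name of every person in the address book.
- (void)logAllFirstNames {
	ABAddressBookRef addressBook = ABAddressBookCreate();
	CFArrayRef people = ABAddressBookCopyArrayOfAllPeople(addressBook);
	CFIndex count = CFArrayGetCount(people);

	for (CFIndex i = 0; i < count; i++) {
		ABRecordRef person = CFArrayGetValueAtIndex(people, i);
		CFStringRef firstName = (CFStringRef)ABRecordCopyValue(person, kABPersonFirstNameProperty);
		NSLog(@"first name: %@", (NSString *)firstName);
		if (firstName) CFRelease(firstName);
	}

	// Core Foundation objects follow the Create/Copy ownership rule, so release them here.
	CFRelease(people);
	CFRelease(addressBook);
}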

This does not directly apply to any of the applications I am currently working on, but no doubt it eventually will be useful in my development efforts. If you have a need to access the address book entries in the iPhone Contacts application, this is the presentation you want to check out.

Stanford iPhone App Programming lecture 11

Lecture 11 from the Stanford University iPhone Application Programming class was hosted by Evan Doll, who is apparently an achiever. He covered text input and displaying views modally, both of which I have had trouble with in the various iPhone app projects I have started.

His talk included a demonstration of the Clang static analyzer, using the keyboard with UITextView and UITextField controls, displaying and dismissing view controllers modally, and the proper way to dismiss said modal view controllers by setting up delegation in the parent view controller.
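
The modal presentation part boils down to something like this sketch; the EditViewController class, its delegate property, and the editViewControllerDidFinish: method are hypothetical stand-ins for whatever the demo actually used:

// In the parent view controller: create the modal controller, make ourselves
// its delegate, and present it.
- (void)showEditor {
	EditViewController *editController =
	    [[EditViewController alloc] initWithNibName:@"EditView" bundle:nil];
	editController.delegate = self;
	[self presentModalViewController:editController animated:YES];
	[editController release];
}

// In EditViewController: don't dismiss yourself directly, just tell the delegate
// you are finished.
- (IBAction)donePressed:(id)sender {
	[delegate editViewControllerDidFinish:self];
}

// Back in the parent (the delegate): do the actual dismissal here.
- (void)editViewControllerDidFinish:(EditViewController *)controller {
	[self dismissModalViewControllerAnimated:YES];
}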

I have been developing an application in which it would be awesome to have some kind of custom virtual keyboard for entering data, as none of the canned keyboards seem to exactly fit the bill. This presentation should help me untangle some of the complexities that I have run across in trying to get this application done. Maybe then I too can be an achiever.