Search Results

Search found 16011 results on 641 pages for 'iphone provisioning'.


  • Texture and Lighting Issue in 3D world

    - by noah
    I'm using OpenGL ES 1.1 for iPhone. I'm attempting to implement a skybox in my 3D world and started out by following one of Jeff LaMarche's tutorials on creating textures. Here's the tutorial: iphonedevelopment.blogspot.com/2009/05/opengl-es-from-ground-up-part-6_25.html I've successfully added the image to my 3D world, but I'm not sure why the lighting on the other shapes has changed so much. I want the shapes to keep their original color and have the image in the background.

    Before: https://www.dropbox.com/s/ojmb8793vj514h0/Screen%20Shot%202012-10-01%20at%205.34.44%20PM.png
    After: https://www.dropbox.com/s/8v6yvur8amgudia/Screen%20Shot%202012-10-01%20at%205.35.31%20PM.png

    Here's the OpenGL initialization:

        - (void)initOpenGLES1 {
            glShadeModel(GL_SMOOTH);

            // Enable lighting and turn the first light on
            glEnable(GL_LIGHTING);
            glEnable(GL_LIGHT0);

            const GLfloat lightAmbient[]  = {0.2, 0.2, 0.2, 1.0};
            const GLfloat lightDiffuse[]  = {0.8, 0.8, 0.8, 1.0};
            const GLfloat matAmbient[]    = {0.3, 0.3, 0.3, 0.5};
            const GLfloat matDiffuse[]    = {1.0, 1.0, 1.0, 1.0};
            const GLfloat matSpecular[]   = {1.0, 1.0, 1.0, 1.0};
            const GLfloat lightPosition[] = {0.0, 0.0, 1.0, 0.0};
            const GLfloat lightShininess  = 100.0;

            // Configure OpenGL lighting
            glEnable(GL_LIGHTING);
            glEnable(GL_LIGHT0);
            glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT, matAmbient);
            glMaterialfv(GL_FRONT_AND_BACK, GL_DIFFUSE, matDiffuse);
            glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, matSpecular);
            glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, lightShininess);
            glLightfv(GL_LIGHT0, GL_AMBIENT, lightAmbient);
            glLightfv(GL_LIGHT0, GL_DIFFUSE, lightDiffuse);
            glLightfv(GL_LIGHT0, GL_POSITION, lightPosition);

            // Define a cutoff angle
            glLightf(GL_LIGHT0, GL_SPOT_CUTOFF, 40.0);

            // Set the clear color
            glClearColor(0, 0, 0, 1.0f);

            // Projection matrix config
            glMatrixMode(GL_PROJECTION);
            glLoadIdentity();
            CGSize layerSize = self.view.layer.frame.size;
            // Swapped height and width for landscape mode
            gluPerspective(45.0f, (GLfloat)layerSize.height / (GLfloat)layerSize.width, 0.1f, 750.0f);

            [self initSkyBox];

            // Modelview matrix config
            glMatrixMode(GL_MODELVIEW);
            glLoadIdentity();

            // This next line is not really needed as it is the default for OpenGL ES
            glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
            glDisable(GL_BLEND);

            // Enable depth testing
            glEnable(GL_DEPTH_TEST);
            glDepthFunc(GL_LESS);
            glDepthMask(GL_TRUE);
        }

    Here's the drawSkyBox that gets called in the drawFrame method:

        -(void)drawSkyBox {
            glDisable(GL_LIGHTING);
            glDisable(GL_DEPTH_TEST);

            glEnableClientState(GL_VERTEX_ARRAY);
            glEnableClientState(GL_NORMAL_ARRAY);
            glEnableClientState(GL_TEXTURE_COORD_ARRAY);

            static const SSVertex3D vertices[] = {
                {-1.0,  1.0, -0.0},
                { 1.0,  1.0, -0.0},
                {-1.0, -1.0, -0.0},
                { 1.0, -1.0, -0.0}
            };

            static const SSVertex3D normals[] = {
                {0.0, 0.0, 1.0},
                {0.0, 0.0, 1.0},
                {0.0, 0.0, 1.0},
                {0.0, 0.0, 1.0}
            };

            static const GLfloat texCoords[] = {
                0.0, 0.5,
                0.5, 0.5,
                0.0, 0.0,
                0.5, 0.0
            };

            glLoadIdentity();
            glTranslatef(0.0, 0.0, -3.0);

            glBindTexture(GL_TEXTURE_2D, texture[0]);
            glVertexPointer(3, GL_FLOAT, 0, vertices);
            glNormalPointer(GL_FLOAT, 0, normals);
            glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
            glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

            glDisableClientState(GL_VERTEX_ARRAY);
            glDisableClientState(GL_NORMAL_ARRAY);
            glDisableClientState(GL_TEXTURE_COORD_ARRAY);

            glEnable(GL_LIGHTING);
            glEnable(GL_DEPTH_TEST);
        }

    Here's the skybox initialization:

        -(void)initSkyBox {
            // Turn necessary features on
            glEnable(GL_TEXTURE_2D);
            glEnable(GL_BLEND);
            glBlendFunc(GL_ONE, GL_SRC_COLOR);

            // Bind the number of textures we need, in this case one.
            glGenTextures(1, &texture[0]);            // create a texture obj, give unique ID
            glBindTexture(GL_TEXTURE_2D, texture[0]); // load our new texture name into the current texture

            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

            NSString *path = [[NSBundle mainBundle] pathForResource:@"space" ofType:@"jpg"];
            NSData *texData = [[NSData alloc] initWithContentsOfFile:path];
            UIImage *image = [[UIImage alloc] initWithData:texData];

            GLuint width = CGImageGetWidth(image.CGImage);
            GLuint height = CGImageGetHeight(image.CGImage);
            CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
            // times 4 because we will write one byte each for RGB and alpha
            void *imageData = malloc(height * width * 4);
            CGContextRef cgContext = CGBitmapContextCreate(imageData, width, height, 8, 4 * width, colorSpace,
                                                           kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);

            // Flip the Y-axis
            CGContextTranslateCTM(cgContext, 0, height);
            CGContextScaleCTM(cgContext, 1.0, -1.0);

            CGColorSpaceRelease(colorSpace);
            CGContextClearRect(cgContext, CGRectMake(0, 0, width, height));
            CGContextDrawImage(cgContext, CGRectMake(0, 0, width, height), image.CGImage);

            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);

            CGContextRelease(cgContext);
            free(imageData);
            [image release];
            [texData release];
        }

    Any help is greatly appreciated.
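
    A hunch worth checking (an assumption from reading the code, not a confirmed diagnosis): initSkyBox leaves GL_TEXTURE_2D enabled for the whole frame, so every lit shape drawn afterwards gets modulated by the last bound texture. A minimal sketch of scoping the texture state to the skybox draw, which would also let you drop the glEnable(GL_TEXTURE_2D) from initSkyBox:

        -(void)drawSkyBox {
            glDisable(GL_LIGHTING);
            glDisable(GL_DEPTH_TEST);
            glEnable(GL_TEXTURE_2D);     // enable texturing only while the skybox draws

            // ... existing client state setup, glBindTexture and glDrawArrays ...

            glDisable(GL_TEXTURE_2D);    // untextured, lit geometry drawn after this
            glEnable(GL_LIGHTING);       // is no longer modulated by the skybox texture
            glEnable(GL_DEPTH_TEST);
        }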

    Read the article

  • iPad Video Playback only delivers audio, not visuals.

    - by Dwaine Bailey
    Hi guys, we recently developed an iPhone app for an external company, and everything works fine in the app. There is a section where the app pulls video from the client's server and streams it into the iPhone's MPMoviePlayerController. This works fine on the iPhone and iPod Touch; both the video and the audio show up just great. The problem, however, is that when the app is run on an iPad (in the iPad's iPhone-compatibility mode), only the audio plays and no video can be seen. Does anybody have any suggestions as to what may be causing this? I thought perhaps it was the encoding, but then why would that prevent the video from playing on the iPad and not on the iPhone?
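
    One thing I'd check (an assumption on my part, not a confirmed diagnosis): starting with iPhone OS 3.2 on the iPad, MPMoviePlayerController no longer presents its own full-screen window; you are expected to add its view to your own hierarchy, and code that relies on the old automatic behavior can end up with audio playing behind an empty screen. A minimal sketch of the 3.2-style setup (videoURL is a hypothetical variable):

        MPMoviePlayerController *player =
            [[MPMoviePlayerController alloc] initWithContentURL:videoURL];

        // On 3.2+ the player view must be added to the hierarchy explicitly
        player.view.frame = self.view.bounds;
        [self.view addSubview:player.view];
        [player play];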

    Read the article

  • Cocos2d: Moving background on update: offset issue

    - by mm24
    Working with Objective-C, iOS and Cocos2d, I am developing a vertical scrolling shooter game for iPhone (Retina display models with 640 x 960 pixel resolution). My basic algorithm works as follows: I create two instances of an image that is exactly 640 x 960 pixels, which we will call imageA and imageB. I then set the two images exactly 480.0f apart from each other, as the screenSize of a CCScene is set by default to 480.0f. At each update method call I move the two images by the same value and make sure that their offset stays at 480.0f. However, when running the game I see a 1-pixel-high line between the two images. This literally bugs me and I would like to fix it. What am I doing wrong?

    This is a zoom in on the background where the "offset line" is visible. The white line you can see divides the two background images and is not meant to exist, as both images are completely black :)

    If I change the yPositionOfSecondElement value to 479.0f, the two images overlap correctly until the first loop, but as soon as the loop starts the two images end up with an offset of -1.0f. Here is the initialization code:

        -(void) init {
            //...
            screenHeight = 480.0f;
            yPositionOfSecondElement = screenHeight; // I tried subtracting an offset of -1 but eventually the image would go wrong again
            yPositionOfFirstElement = 0.0f;

            loopedBackgroundImageInstanceA = [BackgroundLoopedImage loopImageForLevel:levelName];
            loopedBackgroundImageInstanceA.anchorPoint = CGPointMake(0.5f, 0.0f);
            loopedBackgroundImageInstanceA.position = CGPointMake(160.0f, yPositionOfFirstElement);
            [node addChild:loopedBackgroundImageInstanceA z:zLevelBackground];
            //loopedBackgroundImageInstanceA.color = ccRED;

            loopedBackgroundImageInstanceB = [BackgroundLoopedImage loopImageForLevel:levelName];
            loopedBackgroundImageInstanceB.anchorPoint = CGPointMake(0.5f, 0.0f);
            loopedBackgroundImageInstanceB.position = CGPointMake(160.0f, yPositionOfSecondElement);
            [node addChild:loopedBackgroundImageInstanceB z:zLevelBackground];
            //....
        }

    And here is the move code called at each update:

        -(void) moveBackgroundSprites:(BackgroundLoopedImage*)imageA :(BackgroundLoopedImage*)imageB :(ccTime)delta {
            isEligibleToMove = false;

            // This is done to avoid rounding errors
            float yStep = delta * [GameController sharedGameController].currentBackgroundSpeed;
            NSString* formattedNumber = [NSString stringWithFormat:@"%.02f", yStep];
            yStep = atof([formattedNumber UTF8String]);

            // First adjust the position of the images
            [self adjustPosition:imageA :imageB];

            // Then we can get the actual image positions
            CGPoint posA = imageA.position;
            CGPoint posB = imageB.position;

            // Here we can verify that the checksum is equal to the required difference (should be 479.0f)
            if (![self verifyCheckSum:posA :posB]) {
                CCLOG(@"does not comply A");
            }

            // At this stage we can compute the hypothetical new positions
            CGPoint newPosA = CGPointMake(posA.x, posA.y - yStep);
            CGPoint newPosB = CGPointMake(posB.x, posB.y - yStep);

            // Reposition stripes when they're out of bounds
            if (newPosA.y <= -yPositionOfSecondElement) {
                newPosA.y = yPositionOfSecondElement;
                [imageA shuffle];
                if (timeElapsed >= endTime && hasReachedEndLevel == FALSE) {
                    hasReachedEndLevel = TRUE;
                    shouldMoveImageEnd = TRUE;
                }
            }
            else if (newPosB.y <= -yPositionOfSecondElement) {
                newPosB.y = yPositionOfSecondElement;
                [imageB shuffle];
                if (timeElapsed >= endTime && hasReachedEndLevel == FALSE) {
                    hasReachedEndLevel = TRUE;
                    shouldMoveImageEnd = TRUE;
                }
            }

            // Here we verify that the checksum is equal to 479.0f
            if (![self verifyCheckSum:posA :posB]) {
                CCLOG(@"does not comply B");
            }

            imageA.position = newPosA;
            imageB.position = newPosB;

            // Here we could verify that the checksum is equal to 479.0f
            if (![self verifyCheckSum:posA :posB]) {
                CCLOG(@"does not comply C");
            }

            isEligibleToMove = true;
        }

        -(BOOL) verifyCheckSum:(CGPoint)posA :(CGPoint)posB {
            BOOL comply = false;
            float sum = 0.0f;

            if (posA.y > posB.y) {
                sum = posA.y - posB.y;
            }
            else if (posB.y > posA.y) {
                sum = posB.y - posA.y;
            }
            else {
                return false;
            }

            if (sum != yPositionOfSecondElement) {
                comply = false;
            }
            else {
                comply = true;
            }
            return comply;
        }

    And here is what happens on the update:

        if (shouldMoveImageA && shouldMoveImageB) {
            if (isEligibleToMove) {
                [self moveBackgroundSprites:loopedBackgroundImageInstanceA :loopedBackgroundImageInstanceB :delta];
            }
        }

    Forget about shouldMoveImageA and shouldMoveImageB; this is just for when the background reaches the end of the level, and it works.
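
    One common way to stop the seam from drifting (a sketch of an alternative approach, not the asker's code): instead of teleporting each stripe to a fixed constant, reposition it relative to the other stripe, so floating-point error cannot accumulate between them:

        // When a stripe scrolls off the bottom, stack it directly on top of the
        // other stripe rather than at a fixed y, so the 480-unit gap between them
        // is exact by construction on every wrap.
        if (newPosA.y <= -yPositionOfSecondElement) {
            newPosA.y = newPosB.y + yPositionOfSecondElement;
            [imageA shuffle];
        }
        else if (newPosB.y <= -yPositionOfSecondElement) {
            newPosB.y = newPosA.y + yPositionOfSecondElement;
            [imageB shuffle];
        }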

    Read the article

  • Looking for alternative to NetNewsWire and Google Reader

    - by Janine Sisk
    I have used NetNewsWire on both Mac and iPhone for years, and have been reasonably happy. Unfortunately the latest version of NNW for Mac, which I just upgraded to, now matches Google Reader's policy of marking items read after 30 days. The older version I had been using did not do this. I regularly keep unread items for way longer than 30 days, and do eventually read them, so this is a major bummer for me. The iPhone version has behaved this way forever, but I was ok with that as long as I had the full list on my desktop. Keeping in sync between the iPhone and Mac is very important, so I need to find another online solution with clients available on both sides. I've done a little digging but haven't found much; it seems like Google has pretty much taken over this space. Any suggestions?

    Read the article

  • iTunes Home Sharing with VPN

    - by Philip Crumpton
    I'm trying to set up a VPN on a Windows machine that contains my iTunes library, then connect my iPhone to it wirelessly (and remotely) using Home Sharing. I have read that this can easily be set up if the iTunes library is on a Mac (Network Beacon and YazSoft ShareTool are two products I found). I can't find anyone who has had success on a Windows machine, though. As I see it, there are a few options for getting this done:

    1. Find a utility that takes care of this for me (like the Mac-only options listed above) and is compatible with the iPhone (Hamachi is NOT compatible with iPhone VPN).
    2. Manually configure a VPN to allow Bonjour multicast (I'm not sure what this really involves...)
    3. Emulate a Mac on my Windows PC.

    FYI, my router is a Linksys WRT54GL running Tomato 1.28. Note: this question is related to this.

    Read the article

  • Why does the Mobile Safari Browser on iOS not allow file uploads? [migrated]

    - by Kirinriki
    As is already known, it's not possible for iOS users to select image files to upload from Safari on the iPhone, because the browse button that displays the "select file" dialog is disabled. It works fine on Android, but not on the iPhone... What is the particular reason for that? I heard that the browse button is disabled because there isn't a file browser on the iPhone. Someone else said that Safari just disabled root access. Is there any reliable source which explains the issue? (I need it for my thesis.)

    Read the article

  • CCSpriteHole in cocos2d 2.0?

    - by rakkarage
    I was using this cocos2d class, CCSpriteHole, just fine in cocos2d 1.0: http://jpsarda.tumblr.com/post/15779708304/new-cocos2d-iphone-extensions-a-progress-bar-and-a

    Now I am trying to convert it to cocos2d 2.0. I got it to compile by changing glVertexPointer to glVertexAttribPointer, as in the 2.0 version of CCSpriteScale9 here (http://jpsarda.tumblr.com/post/9162433577/scale9grid-for-cocos2d), and by changing contentSizeInPixels_ to contentSize_:

        -(id) init {
            if( (self=[super init]) ) {
                opacityModifyRGB_ = YES;
                opacity_ = 255;
                color_ = colorUnmodified_ = ccWHITE;
                capSize = capSizeInPixels = CGSizeZero; // Not used

                blendFunc_.src = CC_BLEND_SRC;
                blendFunc_.dst = CC_BLEND_DST;

                // update texture (calls updateBlendFunc)
                [self setTexture:nil];

                // default transform anchor
                anchorPoint_ = ccp(0.5f, 0.5f);

                vertexDataCount = 24;
                vertexData = (ccV2F_C4F_T2F*) malloc(vertexDataCount * sizeof(ccV2F_C4F_T2F));

                [self setTextureRectInPixels:CGRectZero untrimmedSize:CGSizeZero];
            }
            return self;
        }

        -(id) initWithTexture:(CCTexture2D*)texture rect:(CGRect)rect {
            NSAssert(texture!=nil, @"Invalid texture for sprite");
            // IMPORTANT: [self init] and not [super init];
            if( (self = [self init]) ) {
                [self setTexture:texture];
                [self setTextureRect:rect];
            }
            return self;
        }

        -(id) initWithTexture:(CCTexture2D*)texture {
            NSAssert(texture!=nil, @"Invalid texture for sprite");
            CGRect rect = CGRectZero;
            rect.size = texture.contentSize;
            return [self initWithTexture:texture rect:rect];
        }

        -(id) initWithFile:(NSString*)filename {
            NSAssert(filename!=nil, @"Invalid filename for sprite");
            CCTexture2D *texture = [[CCTextureCache sharedTextureCache] addImage:filename];
            if( texture )
                return [self initWithTexture:texture];
            return nil;
        }

        +(id)spriteWithFile:(NSString*)f {
            return [[self alloc] initWithFile:f];
        }

        - (void) dealloc {
            if (vertexData)
                free(vertexData);
        }

        -(void) updateColor {
            ccColor4F color4;
            color4.r = (float)color_.r / 255.0f;
            color4.g = (float)color_.g / 255.0f;
            color4.b = (float)color_.b / 255.0f;
            color4.a = (float)opacity_ / 255.0f;
            for (int i = 0; i < vertexDataCount; i++) {
                vertexData[i].colors = color4;
            }
        }

        -(void)updateTextureCoords:(CGRect)rect {
            CCTexture2D *tex = texture_;
            if(!tex)
                return;

            float atlasWidth = (float)tex.pixelsWide;
            float atlasHeight = (float)tex.pixelsHigh;

            float left, right, top, bottom;
            left = rect.origin.x / atlasWidth;
            right = left + rect.size.width / atlasWidth;
            top = rect.origin.y / atlasHeight;
            bottom = top + rect.size.height / atlasHeight;

            //
            // |/|/|/|
            //
            CGSize capTexCoordsSize = CGSizeMake(capSizeInPixels.width / atlasWidth, capSizeInPixels.height / atlasHeight);

            // From left to right
            // Top band
            // Left
            vertexData[0].texCoords = (ccTex2F){left, top};
            vertexData[1].texCoords = (ccTex2F){left, top + capTexCoordsSize.height};
            vertexData[2].texCoords = (ccTex2F){left + capTexCoordsSize.width, top};
            vertexData[3].texCoords = (ccTex2F){left + capTexCoordsSize.width, top + capTexCoordsSize.height};
            // Center
            vertexData[4].texCoords = (ccTex2F){right - capTexCoordsSize.width, top};
            vertexData[5].texCoords = (ccTex2F){right - capTexCoordsSize.width, top + capTexCoordsSize.height};
            // Right
            vertexData[6].texCoords = (ccTex2F){right, top};
            vertexData[7].texCoords = (ccTex2F){right, top + capTexCoordsSize.height};

            // Center band
            // Left
            vertexData[8].texCoords = (ccTex2F){left, bottom - capTexCoordsSize.height};
            vertexData[9].texCoords = (ccTex2F){left, top + capTexCoordsSize.height};
            vertexData[10].texCoords = (ccTex2F){left + capTexCoordsSize.width, bottom - capTexCoordsSize.height};
            vertexData[11].texCoords = (ccTex2F){left + capTexCoordsSize.width, top + capTexCoordsSize.height};
            // Center
            vertexData[12].texCoords = (ccTex2F){right - capTexCoordsSize.width, bottom - capTexCoordsSize.height};
            vertexData[13].texCoords = (ccTex2F){right - capTexCoordsSize.width, top + capTexCoordsSize.height};
            // Right
            vertexData[14].texCoords = (ccTex2F){right, bottom - capTexCoordsSize.height};
            vertexData[15].texCoords = (ccTex2F){right, top + capTexCoordsSize.height};

            // Bottom band
            // Left
            vertexData[16].texCoords = (ccTex2F){left, bottom};
            vertexData[17].texCoords = (ccTex2F){left, bottom - capTexCoordsSize.height};
            vertexData[18].texCoords = (ccTex2F){left + capTexCoordsSize.width, bottom};
            vertexData[19].texCoords = (ccTex2F){left + capTexCoordsSize.width, bottom - capTexCoordsSize.height};
            // Center
            vertexData[20].texCoords = (ccTex2F){right - capTexCoordsSize.width, bottom};
            vertexData[21].texCoords = (ccTex2F){right - capTexCoordsSize.width, bottom - capTexCoordsSize.height};
            // Right
            vertexData[22].texCoords = (ccTex2F){right, bottom};
            vertexData[23].texCoords = (ccTex2F){right, bottom - capTexCoordsSize.height};
        }

        -(void) updateVertices {
            float left = 0;   //-spriteSizeInPixels.width*0.5f;
            float right = left + contentSize_.width;
            float bottom = 0; //-spriteSizeInPixels.height*0.5f;
            float top = bottom + contentSize_.height;

            float holeLeft = holeRect.origin.x * CC_CONTENT_SCALE_FACTOR();
            float holeRight = holeLeft + holeRect.size.width * CC_CONTENT_SCALE_FACTOR();
            float holeBottom = holeRect.origin.y * CC_CONTENT_SCALE_FACTOR();
            float holeTop = holeBottom + holeRect.size.height * CC_CONTENT_SCALE_FACTOR();

            //
            // |/|/|/|
            //
            // From left to right
            // Top band
            // Left
            vertexData[0].vertices = (ccVertex2F){left, top};
            vertexData[1].vertices = (ccVertex2F){left, holeTop};
            vertexData[2].vertices = (ccVertex2F){holeLeft, top};
            vertexData[3].vertices = (ccVertex2F){holeLeft, holeTop};
            // Center
            vertexData[4].vertices = (ccVertex2F){holeRight, top};
            vertexData[5].vertices = (ccVertex2F){holeRight, holeTop};
            // Right
            vertexData[6].vertices = (ccVertex2F){right, top};
            vertexData[7].vertices = (ccVertex2F){right, holeTop};

            // Center band
            // Left
            vertexData[8].vertices = (ccVertex2F){left, holeBottom};
            vertexData[9].vertices = (ccVertex2F){left, holeTop};
            vertexData[10].vertices = (ccVertex2F){holeLeft, holeBottom};
            vertexData[11].vertices = (ccVertex2F){holeLeft, holeTop};
            // Center
            vertexData[12].vertices = (ccVertex2F){holeRight, holeBottom};
            vertexData[13].vertices = (ccVertex2F){holeRight, holeTop};
            // Right
            vertexData[14].vertices = (ccVertex2F){right, holeBottom};
            vertexData[15].vertices = (ccVertex2F){right, holeTop};

            // Bottom band
            // Left
            vertexData[16].vertices = (ccVertex2F){left, bottom};
            vertexData[17].vertices = (ccVertex2F){left, holeBottom};
            vertexData[18].vertices = (ccVertex2F){holeLeft, bottom};
            vertexData[19].vertices = (ccVertex2F){holeLeft, holeBottom};
            // Center
            vertexData[20].vertices = (ccVertex2F){holeRight, bottom};
            vertexData[21].vertices = (ccVertex2F){holeRight, holeBottom};
            // Right
            vertexData[22].vertices = (ccVertex2F){right, bottom};
            vertexData[23].vertices = (ccVertex2F){right, holeBottom};
        }

        -(void) setHole:(CGRect)r inRect:(CGRect)totalSurface {
            holeRect = r;
            self.contentSize = totalSurface.size;
            holeRect.origin = ccpSub(holeRect.origin, totalSurface.origin);
            CGPoint holeCenter = ccp(holeRect.origin.x + holeRect.size.width * 0.5f,
                                     holeRect.origin.y + holeRect.size.height * 0.5f);
            self.anchorPoint = ccp(holeCenter.x / contentSize_.width, holeCenter.y / contentSize_.height);
            //[self updateTextureCoords:rectInPixels_];
            [self updateVertices];
            [self updateColor];
        }

        -(void) draw {
            BOOL newBlend = NO;
            if( blendFunc_.src != CC_BLEND_SRC || blendFunc_.dst != CC_BLEND_DST ) {
                newBlend = YES;
                glBlendFunc( blendFunc_.src, blendFunc_.dst );
            }

            glBindTexture(GL_TEXTURE_2D, [texture_ name]);

            glVertexAttribPointer(kCCVertexAttrib_Position, 2, GL_FLOAT, GL_FALSE, sizeof(ccV2F_C4F_T2F), &vertexData[0].vertices);
            glVertexAttribPointer(kCCVertexAttrib_TexCoords, 2, GL_FLOAT, GL_FALSE, sizeof(ccV2F_C4F_T2F), &vertexData[0].texCoords);
            glVertexAttribPointer(kCCVertexAttrib_Color, 4, GL_FLOAT, GL_FALSE, sizeof(ccV2F_C4F_T2F), &vertexData[0].colors);
            glDrawArrays(GL_TRIANGLE_STRIP, 0, 8);

            glVertexAttribPointer(kCCVertexAttrib_Position, 2, GL_FLOAT, GL_FALSE, sizeof(ccV2F_C4F_T2F), &vertexData[8].vertices);
            glVertexAttribPointer(kCCVertexAttrib_TexCoords, 2, GL_FLOAT, GL_FALSE, sizeof(ccV2F_C4F_T2F), &vertexData[8].texCoords);
            glVertexAttribPointer(kCCVertexAttrib_Color, 4, GL_FLOAT, GL_FALSE, sizeof(ccV2F_C4F_T2F), &vertexData[8].colors);
            glDrawArrays(GL_TRIANGLE_STRIP, 0, 8);

            glVertexAttribPointer(kCCVertexAttrib_Position, 2, GL_FLOAT, GL_FALSE, sizeof(ccV2F_C4F_T2F), &vertexData[16].vertices);
            glVertexAttribPointer(kCCVertexAttrib_TexCoords, 2, GL_FLOAT, GL_FALSE, sizeof(ccV2F_C4F_T2F), &vertexData[16].texCoords);
            glVertexAttribPointer(kCCVertexAttrib_Color, 4, GL_FLOAT, GL_FALSE, sizeof(ccV2F_C4F_T2F), &vertexData[16].colors);
            glDrawArrays(GL_TRIANGLE_STRIP, 0, 8);

            if( newBlend )
                glBlendFunc(CC_BLEND_SRC, CC_BLEND_DST);
        }

        -(void)setTextureRectInPixels:(CGRect)rect untrimmedSize:(CGSize)untrimmedSize {
            rectInPixels_ = rect;
            rect_ = CC_RECT_PIXELS_TO_POINTS( rect );
            //[self setContentSizeInPixels:untrimmedSize];
            [self updateTextureCoords:rectInPixels_];
        }

        -(void)setTextureRect:(CGRect)rect {
            CGRect rectInPixels = CC_RECT_POINTS_TO_PIXELS( rect );
            [self setTextureRectInPixels:rectInPixels untrimmedSize:rectInPixels.size];
        }

        //
        // RGBA protocol
        //
        #pragma mark CCSpriteHole - RGBA protocol

        -(GLubyte) opacity {
            return opacity_;
        }

        -(void) setOpacity:(GLubyte)anOpacity {
            opacity_ = anOpacity;
            // special opacity for premultiplied textures
            if( opacityModifyRGB_ )
                [self setColor: (opacityModifyRGB_ ? colorUnmodified_ : color_)];
            [self updateColor];
        }

        - (ccColor3B) color {
            if(opacityModifyRGB_) {
                return colorUnmodified_;
            }
            return color_;
        }

        -(void) setColor:(ccColor3B)color3 {
            color_ = colorUnmodified_ = color3;
            if( opacityModifyRGB_ ) {
                color_.r = color3.r * opacity_/255;
                color_.g = color3.g * opacity_/255;
                color_.b = color3.b * opacity_/255;
            }
            [self updateColor];
        }

        -(void) setOpacityModifyRGB:(BOOL)modify {
            ccColor3B oldColor = self.color;
            opacityModifyRGB_ = modify;
            self.color = oldColor;
        }

        -(BOOL) doesOpacityModifyRGB {
            return opacityModifyRGB_;
        }

        #pragma mark CCSpriteHole - CocosNodeTexture protocol

        -(void) updateBlendFunc {
            if( !texture_ || ![texture_ hasPremultipliedAlpha] ) {
                blendFunc_.src = GL_SRC_ALPHA;
                blendFunc_.dst = GL_ONE_MINUS_SRC_ALPHA;
                [self setOpacityModifyRGB:NO];
            } else {
                blendFunc_.src = CC_BLEND_SRC;
                blendFunc_.dst = CC_BLEND_DST;
                [self setOpacityModifyRGB:YES];
            }
        }

        -(void) setTexture:(CCTexture2D*)texture {
            // accept texture==nil as argument
            NSAssert( !texture || [texture isKindOfClass:[CCTexture2D class]], @"setTexture expects a CCTexture2D. Invalid argument");
            texture_ = texture;
            [self updateBlendFunc];
        }

        -(CCTexture2D*) texture {
            return texture_;
        }

        @end

    But now positioning and scaling don't seem to work, and the sprite starts in the wrong position... though changing the opacity still works. So I was wondering if anyone can see why my 2.0 version is not working? Or is there a better way to do a sprite hole with cocos2d/OpenGL ES 2.0: shaders? Thanks
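
    A plausible cause (my assumption, based on how the cocos2d 2.0 renderer works rather than on the post itself): in 2.0 the fixed-function matrix stack is gone, so a custom draw method must bind a shader program and upload the node's transform uniforms; otherwise the vertices are interpreted in untransformed clip space, which would explain position and scale being ignored while opacity (plain vertex colors) still works. A minimal sketch of the 2.0-style setup:

        // In init: give the node one of cocos2d's stock shaders
        self.shaderProgram = [[CCShaderCache sharedShaderCache]
                                 programForKey:kCCShader_PositionTextureColor];

        -(void) draw {
            CC_NODE_DRAW_SETUP();   // binds the shader and sets the MVP matrix uniforms

            ccGLBlendFunc(blendFunc_.src, blendFunc_.dst);
            ccGLBindTexture2D([texture_ name]);
            ccGLEnableVertexAttribs(kCCVertexAttribFlag_PosColorTex);

            glVertexAttribPointer(kCCVertexAttrib_Position, 2, GL_FLOAT, GL_FALSE,
                                  sizeof(ccV2F_C4F_T2F), &vertexData[0].vertices);
            glVertexAttribPointer(kCCVertexAttrib_TexCoords, 2, GL_FLOAT, GL_FALSE,
                                  sizeof(ccV2F_C4F_T2F), &vertexData[0].texCoords);
            glVertexAttribPointer(kCCVertexAttrib_Color, 4, GL_FLOAT, GL_FALSE,
                                  sizeof(ccV2F_C4F_T2F), &vertexData[0].colors);
            glDrawArrays(GL_TRIANGLE_STRIP, 0, 8);
            // ... repeat the attrib pointers and glDrawArrays for the other two bands ...
        }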

    Read the article

  • Day 1: iPhone Web Development

    - by BizTalk Visionary
    Preamble: In an attempt to keep an ageing mind alive I have decided to write an iPhone web app. The app will pull together a whole host of disciplines and should be a challenge.

    The App: Over the last month or so I have been playing around with Google Maps and InstaMapper (see InstaMapper.com), so I decided the app must build on this learning. Looking to create an uber "Where's Tigger!"

    Framework: After an exhaustive research period I'm going with jQTouch.

    iPhone Simulator: MobiOne

    More to follow later…

    Read the article

  • A simple trick to play YouTube videos in the background on iPhone/iPad running iOS 7

    - by Gopinath
    YouTube is not only the best source of videos, but also of great music. Most Indian movie music albums are officially released on YouTube in high quality. I listen to a lot of music on YouTube instead of streaming from dedicated music sites, as their quality is no match for YouTube's. While it is possible to keep YouTube playing in the background on a desktop, that's normally not possible on smartphones and tablets, which stop the backgrounded YouTube app automatically. But a simple trick from tech guru labnol lets you play YouTube videos in the background on iPhones/iPads running the latest iOS 7. Here is a video demonstration of the trick. Read the detailed explanation of the trick at labnol, and don't miss browsing through the hundreds of useful tips and tricks listed on the blog.

    Read the article

  • "Développer pour l'iPhone, c'est servir un propriétaire terrien", déclare l'inventeur du XML : posit

    Update of 16/02/10: "Those who create applications for the iPhone are serving the will of a landowner," declares the creator of XML. Is there a developer revolt against Apple? Rather than work for Oracle, the co-inventor of XML has just left Sun to join Google, where he is taking aim at... Apple. Go figure. "[Apple's] vision of the future of the mobile Internet excludes controversy, sex and freedom, but it sets strict limits to control who can know what and who can say what. It's a sanitized, Disneyland-like garden, surrounded by walls…

    Read the article

  • Develop a 3D game for iPhone & Android from a single codebase

    - by lajpat
    I have to make a 3D game along the lines of this app: 3D Chess. I want to make this app for both Android and iPhone by writing a single codebase; of course, a little native code will be required. I want to ensure that the entire logic and animation code is written only once. What software would one suggest to achieve this, given that I am not on a tight schedule? I came across Corona, but I am not sure such a game can be made using it. Others I found are Unity and Shiva. I am not experienced in 3D games, so please help if you can. Thanks, Lajpat

    Read the article

  • Netflix Rolls Out Polished New iPhone and Android Apps [Video]

    - by Jason Fitzpatrick
    If you’re a Netflix subscriber, you’ve got a brand spanking new mobile interface to take for a spin. Last week Netflix released a brand new iOS interface; this week it’s a brand new Android interface. The above video showcases the new iOS interface for mobile playback on devices like the iPhone and iPad. The slick new layout makes it even easier to browse new content and resume watching content you’ve paused at home or on the go. For a peek at the new (and similar) Android interface, check out the video below. For more information about the respective apps, visit their download pages to read up and grab a copy. Netflix for Android / iOS

    Read the article

  • Webinar on Cross Platform Development with MonoTouch for the iPhone and Mono for Android on Wednesday

    - by Wallym
    The iPhone and Android are dominant in the marketplace. The two platforms currently hold 85% of the smartphone market and are continuing to grow that share. Developers are being tasked with targeting these two platforms. In this session, we’ll take a high-level look at how we can use C# and .NET knowledge to share code between iOS and Android. We’ll look at linked files, using the Xamarin Mobile API, the challenges of running across platforms and frameworks, as well as other features of Visual Studio, MonoTouch, MonoDevelop, and Mono for Android that allow us to write as much code as possible that can run on both platforms. Here is the registration link: https://www302.livemeeting.com/lrs/8001676474/Registration.aspx?pageName=2w197495hzh0t56g

    Read the article

  • Painfully Slow iPhone Backup?

    - by Christopher House
    And now for something completely different... For the past couple of weeks, I've not been able to sync my iPhone. What happens is that I plug it in to my computer, start the sync, and two hours later iTunes shows it's about 75% complete. After some googling, I tried a few things like deleting old backups, changing some settings, etc., but nothing seemed to help. Tonight I decided I'd try deleting all the photos and videos on my phone. Sure enough, that did the trick. I'm now able to successfully sync in a reasonable amount of time.

    Read the article

  • Building iPhone Interfaces for Oracle E-Business Suite

    - by Shay Shmeltzer
    Over on his blog, Juan has been showing you how simple it is to develop a rich Web user interface with ADF accessing Oracle E-Business Suite data, and even exposing it on an iPad. In this entry I'm starting from his sample application and showing how easy it is to build an interface that will look great on an iPhone (or other mobile devices) using Oracle ADF Mobile Browser. For those of you who are just using ADF and have never tried ADF Mobile Browser: you'll find that the development experience is quite familiar and similar to your normal Web application development. The latest version of JDeveloper (11.1.2.1), which I'm using in this demo, has a built-in skin that will give your application the native iOS look and feel. In the demo I achieve this by setting the styleClass of a tr:panelHeader component to af_m_toolbar. For more on this styling, read the doc. Check out this quick demo:

    Read the article

  • Drawing map tiles for iPhone game

    - by user17778
    I'm working on a turn-based strategy game for the iPhone that has a hexagon-grid based map in it. I'm in the process of drawing up the actual tiles for the different landscapes (i.e. forest, grassland, etc.) and was wondering what program to draw the tile images in. I would assume Adobe Illustrator since a vector-based image may allow for smooth images even when the user is zoomed in really close. Is this right? Thanks!

    Read the article

  • Options for 2D Web/iPhone/Android, with Test Framework

    - by ashes999
    I would like to make a 2D game with one codebase that runs on iPhone, Android, and the web (any flavour of web: HTML5, Flash, Silverlight, etc.). What are my options? I should be able to write my code once and run it anywhere. I also absolutely need the ability to write unit tests (or write a unit-testing framework); I cannot make sizable games without testing. Unity is good, but Unity is 3D; even with hacks, the graphical assets will still be 3D. I'm after 2D, not 3D. (If you need a Mac, or separate licensing for the Mac part, that's okay.)

    Read the article

  • iPhone: how do I use the performSelector:onThread:withObject:waitUntilDone: method?

    - by Michael Kessler
    Hi all, I am trying to use a separate thread for working with some API. The problem is that I am not able to use the performSelector:onThread:withObject:waitUntilDone: method with a thread that I instantiated for this. My code:

        @interface MyObject : NSObject {
            NSThread *_myThread;
        }
        @property(nonatomic, retain) NSThread *myThread;
        @end

        @implementation MyObject

        @synthesize myThread = _myThread;

        - (NSThread *)myThread {
            if (_myThread == nil) {
                NSThread *myThreadTemp = [[NSThread alloc] init];
                [myThreadTemp start];
                self.myThread = myThreadTemp;
                [myThreadTemp release];
            }
            return _myThread;
        }

        - (id)init {
            if (self = [super init]) {
                [self performSelector:@selector(privateInit:)
                             onThread:[self myThread]
                           withObject:nil
                        waitUntilDone:NO];
            }
            return self;
        }

        - (void)privateInit:(id)object {
            NSLog(@"MyObject - privateInit start");
        }

        - (void)dealloc {
            [_myThread release];
            _myThread = nil;
            [super dealloc];
        }

        @end

    "MyObject - privateInit start" is never printed. What am I missing? I tried instantiating the thread with a target and selector, and tried waiting for the method execution to complete (waitUntilDone:YES). Nothing helps.

    UPDATE: I don't need this multithreading to move costly operations onto another thread; in that case I could use performSelectorInBackground, as mentioned in a few answers. The main reason for this separate thread is the need to perform all the actions in the API (TTS by Loquendo) from one single thread. Meaning that I have to create an instance of the TTS object and call methods on that object from the same thread all the time.
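
    For reference, the usual fix (my sketch, based on how NSThread and run loops work, not on an accepted answer): performSelector:onThread:... queues the selector on the target thread's run loop, so the thread must actually be running a run loop with at least one input source; a bare [[NSThread alloc] init] has no entry point and never services anything. A worker thread kept alive with a run loop might look like this:

        - (NSThread *)myThread {
            if (_myThread == nil) {
                _myThread = [[NSThread alloc] initWithTarget:self
                                                    selector:@selector(threadMain)
                                                      object:nil];
                [_myThread start];
            }
            return _myThread;
        }

        // Entry point: keep a run loop alive so queued selectors get serviced
        - (void)threadMain {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            // Attach a dummy port so the run loop has an input source and doesn't exit
            [[NSRunLoop currentRunLoop] addPort:[NSMachPort port]
                                        forMode:NSDefaultRunLoopMode];
            [[NSRunLoop currentRunLoop] run];
            [pool release];
        }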

    Read the article

  • iPhone SDK: How do I create an overlay drop-down menu?

    - by arooaroo
    Hi, I have a UIViewController containing its UIView, which occupies the available screen (between the tab bar and nav bar). I'd now like to add a simple toolbar at the top of the UIView which will contain some buttons. One of the buttons should display a drop-down menu shown as an overlay over the UIView. I'd quite like this menu to be a UITableView, as it could contain many items. The problem I'm having is that I can't see the best way to go about this, and I'm wondering whether there's a simple strategy. Here's an example of the type of feature I'm looking for...

    Menu hidden: http://www.flickr.com/photos/peterevers/3147678219/in/photostream/
    Menu displayed: http://www.flickr.com/photos/peterevers/3148550320/

    I expect the above example is implemented inside a UIWebView using HTML/CSS. Is there a "proper" way? TIA
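
    One straightforward native approach (a sketch under my own naming, not a known "proper" pattern; menuTable is a hypothetical ivar): keep a UITableView as a hidden subview above everything else and toggle it from the toolbar button:

        // In the view controller
        - (void)viewDidLoad {
            [super viewDidLoad];
            menuTable = [[UITableView alloc] initWithFrame:CGRectMake(0, 44, 320, 200)
                                                     style:UITableViewStylePlain];
            menuTable.dataSource = self;   // controller implements UITableViewDataSource
            menuTable.delegate = self;     // and UITableViewDelegate
            menuTable.hidden = YES;
            [self.view addSubview:menuTable];
        }

        // Toolbar button action: toggle the overlay
        - (IBAction)toggleMenu:(id)sender {
            menuTable.hidden = !menuTable.hidden;
            [self.view bringSubviewToFront:menuTable];
        }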

    Read the article

  • Assertion failure when trying to write (INSERT, UPDATE) to SQLite database on iPhone

    - by Mark McFarlane
    I have a really frustrating error that I've spent hours looking at and cannot fix. I can get data from my db no problem with this code, but inserting or updating gives me these errors:

        *** Assertion failure in +[Functions db_insert_answer:question_type:score:], /Volumes/Xcode/Kanji/Classes/Functions.m:129
        *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Error inserting: db_insert_answer:question_type:score:'

    Here is the code I'm using:

        [Functions db_insert_answer:[[dict_question objectForKey:@"JISDec"] intValue]
                      question_type:@"kanji_meaning"
                              score:arc4random() % 100];
        // update EF, Next_question, n here
        [Functions db_update_EF:[dict_question objectForKey:@"question"] EF:EF];

    To call these functions:

        +(sqlite3_stmt *)db_query:(NSString *)queryText {
            sqlite3 *database = [self get_db];
            sqlite3_stmt *statement;
            NSLog(queryText);
            if (sqlite3_prepare_v2(database, [queryText UTF8String], -1, &statement, nil) == SQLITE_OK) {
            } else {
                NSLog(@"HMM, COULDNT RUN QUERY: %s\n", sqlite3_errmsg(database));
            }
            sqlite3_close(database);
            return statement;
        }

        +(void)db_insert_answer:(int)obj_id question_type:(NSString *)question_type score:(int)score {
            sqlite3 *database = [self get_db];
            sqlite3_stmt *statement;
            char *errorMsg;
            char *update = "INSERT INTO Answers (obj_id, question_type, score, date) VALUES (?, ?, ?, DATE())";
            if (sqlite3_prepare_v2(database, update, -1, &statement, nil) == SQLITE_OK) {
                sqlite3_bind_int(statement, 1, obj_id);
                sqlite3_bind_text(statement, 2, [question_type UTF8String], -1, NULL);
                sqlite3_bind_int(statement, 3, score);
            }
            if (sqlite3_step(statement) != SQLITE_DONE) {
                NSAssert1(0, @"Error inserting: %s", errorMsg);
            }
            sqlite3_finalize(statement);
            sqlite3_close(database);
            NSLog(@"Answer saved");
        }

        +(void)db_update_EF:(NSString *)kanji EF:(int)EF {
            sqlite3 *database = [self get_db];
            sqlite3_stmt *statement;
            //NSLog(queryText);
            char *errorMsg;
            char *update = "UPDATE Kanji SET EF = ? WHERE Kanji = '?'";
            if (sqlite3_prepare_v2(database, update, -1, &statement, nil) == SQLITE_OK) {
                sqlite3_bind_int(statement, 1, EF);
                sqlite3_bind_text(statement, 2, [kanji UTF8String], -1, NULL);
            } else {
                NSLog(@"HMM, COULDNT RUN QUERY: %s\n", sqlite3_errmsg(database));
            }
            if (sqlite3_step(statement) != SQLITE_DONE) {
                NSAssert1(0, @"Error updating: %s", errorMsg);
            }
            sqlite3_finalize(statement);
            sqlite3_close(database);
            NSLog(@"Update saved");
        }

        +(sqlite3 *)get_db {
            sqlite3 *database;
            NSFileManager *fileManager = [NSFileManager defaultManager];
            NSString *copyFrom = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"/kanji_training.sqlite"];
            if ([fileManager fileExistsAtPath:[self dataFilePath]]) {
                //NSLog(@"DB FILE ALREADY EXISTS");
            } else {
                [fileManager copyItemAtPath:copyFrom toPath:[self dataFilePath] error:nil];
                NSLog(@"COPIED DB TO DOCUMENTS BECAUSE IT DIDNT EXIST: NEW INSTALL");
            }
            if (sqlite3_open([[self dataFilePath] UTF8String], &database) != SQLITE_OK) {
                sqlite3_close(database);
                NSAssert(0, @"Failed to open database");
                NSLog(@"FAILED TO OPEN DB");
            } else {
                if ([fileManager fileExistsAtPath:[self dataFilePath]]) {
                    //NSLog(@"DB PATH:");
                    //NSLog([self dataFilePath]);
                }
            }
            return database;
        }

        + (NSString *)dataFilePath {
            NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
            NSString *documentsDirectory = [paths objectAtIndex:0];
            return [documentsDirectory stringByAppendingPathComponent:@"kanji_training.sqlite"];
        }

    I really can't work it out! Can anyone help me? Many thanks.
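
    A few things stand out here (my reading, not an accepted answer): the code falls through and calls sqlite3_step even when the prepare failed, errorMsg is never assigned so the assert prints garbage, the UPDATE quotes its placeholder ('?' inside quotes is a string literal, not a bind point), and db_query closes the database before the caller ever steps the statement. A sketch of a tighter insert, assuming the same schema:

        +(void)db_insert_answer:(int)obj_id question_type:(NSString *)question_type score:(int)score {
            sqlite3 *database = [self get_db];
            sqlite3_stmt *statement = NULL;
            const char *sql = "INSERT INTO Answers (obj_id, question_type, score, date) VALUES (?, ?, ?, DATE())";

            if (sqlite3_prepare_v2(database, sql, -1, &statement, NULL) != SQLITE_OK) {
                // Bail out instead of stepping a statement that was never prepared
                NSLog(@"Prepare failed: %s", sqlite3_errmsg(database));
                sqlite3_close(database);
                return;
            }
            sqlite3_bind_int(statement, 1, obj_id);
            // SQLITE_TRANSIENT makes SQLite copy the string before the NSString goes away
            sqlite3_bind_text(statement, 2, [question_type UTF8String], -1, SQLITE_TRANSIENT);
            sqlite3_bind_int(statement, 3, score);

            if (sqlite3_step(statement) != SQLITE_DONE) {
                // sqlite3_errmsg gives the real reason, unlike the uninitialized errorMsg
                NSLog(@"Insert failed: %s", sqlite3_errmsg(database));
            }
            sqlite3_finalize(statement);
            sqlite3_close(database);
        }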

    Read the article

  • iPhone mapView / mapKit using removeAnnotation & addAnnotation results in memory leak?

    - by user266618
    To update the location of a GPS indicator on mapView...

        [mapView removeAnnotation:myGpsAnnotation];
        [myGpsAnnotation release];
        myGpsAnnotation = nil;
        myGpsAnnotation = [[MapLocationAnnotation alloc] initWithCoordinate:region.center
                                                             annotationType:MapAnnotationTypeGps
                                                                      title:MAP_ANNOTATION_TYPE_GPS];
        [mapView addAnnotation:myGpsAnnotation];

    ...I see net memory slowly climbing in Instruments (simulator). No "Leaks" blip, but "Net Bytes" and "# Net" slowly incrementing... unless this code is commented out. So I'm 100% certain this is the offending code. BUT if I do the following...

        [mapView removeAnnotation:myGpsAnnotation];
        [myGpsAnnotation release];
        myGpsAnnotation = nil;
        myGpsAnnotation = [[MapLocationAnnotation alloc] initWithCoordinate:region.center
                                                             annotationType:MapAnnotationTypeGps
                                                                      title:MAP_ANNOTATION_TYPE_GPS];
        [mapView addAnnotation:myGpsAnnotation];
        [mapView removeAnnotation:myGpsAnnotation];
        [mapView addAnnotation:myGpsAnnotation];
        [mapView removeAnnotation:myGpsAnnotation];
        [mapView addAnnotation:myGpsAnnotation];

    ...then "Net Bytes" and "# Net" increase much faster. Is it possible this isn't my mistake, and I'm trying to track down a leak in MapKit? Am I really leaking memory? Again, nothing appears under "Leaks", but then I don't see why the Net values would be continually climbing. Thanks for any help, -Gord
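
    One way to sidestep the churn entirely (a sketch, assuming MapLocationAnnotation stores its coordinate in a readwrite property, which the custom class would need to expose): keep a single annotation object and move it, instead of allocating, removing and re-adding one on every GPS tick:

        // Create the annotation once, e.g. in viewDidLoad
        myGpsAnnotation = [[MapLocationAnnotation alloc] initWithCoordinate:region.center
                                                             annotationType:MapAnnotationTypeGps
                                                                      title:MAP_ANNOTATION_TYPE_GPS];
        [mapView addAnnotation:myGpsAnnotation];

        // On each location update, just move the existing annotation; this avoids
        // whatever MapKit caches per added annotation. On older MapKit versions the
        // view may not track the change, in which case wrap it in remove/add.
        myGpsAnnotation.coordinate = region.center;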

    Read the article

  • iPhone: Help with AudioToolbox leak: stack trace/code included here...

    - by editor guy
    Part of this app is a "Scream" button that plays random screams from cast members of a TV show. I have to bang on the app quite a while to see a memory leak in Instruments, but it's there, occasionally coming up (every 45 seconds to 2 minutes). The leak is 3.50 KB when it occurs. I haven't been able to crack it for several hours. Any help appreciated. Instruments says the offending code line is [appSoundPlayer play], which is linked to from frame 9 of the stack trace below:

        0  libSystem.B.dylib  malloc
        1  libSystem.B.dylib  pthread_create
        2  AudioToolbox  CAPThread::Start()
        3  AudioToolbox  GenericRunLoopThread::Start()
        4  AudioToolbox  AudioQueueNew(bool, AudioStreamBasicDescription const*, TCACallback const&, CACallbackTarget const&, unsigned long, OpaqueAudioQueue*)
        5  AudioToolbox  AudioQueueNewOutput
        6  AVFoundation  allocAudioQueue(AVAudioPlayer, AudioPlayerImpl*)
        7  AVFoundation  prepareToPlayQueue(AVAudioPlayer*, AudioPlayerImpl*)
        8  AVFoundation  -[AVAudioPlayer prepareToPlay]
        9  Scream Queens  -[ScreamViewController scream:] /Users/laptop2/Desktop/ScreamQueens Versions/ScreamQueens25/Scream Queens/Classes/../ScreamViewController.m:210
        10 CoreFoundation  -[NSObject performSelector:withObject:withObject:]
        11 UIKit  -[UIApplication sendAction:to:from:forEvent:]
        12 UIKit  -[UIApplication sendAction:toTarget:fromSender:forEvent:]
        13 UIKit  -[UIControl sendAction:to:forEvent:]
        14 UIKit  -[UIControl(Internal) _sendActionsForEvents:withEvent:]
        15 UIKit  -[UIControl touchesEnded:withEvent:]
        16 UIKit  -[UIWindow _sendTouchesForEvent:]
        17 UIKit  -[UIWindow sendEvent:]
        18 UIKit  -[UIApplication sendEvent:]
        19 UIKit  _UIApplicationHandleEvent
        20 GraphicsServices  PurpleEventCallback
        21 CoreFoundation  CFRunLoopRunSpecific
        22 CoreFoundation  CFRunLoopRunInMode
        23 GraphicsServices  GSEventRunModal
        24 UIKit  -[UIApplication _run]
        25 UIKit  UIApplicationMain
        26 Scream Queens  main /Users/laptop2/Desktop/ScreamQueens Versions/ScreamQueens25/Scream Queens/main.m:14
        27 Scream Queens  start

    Here's the .h:

        #import <UIKit/UIKit.h>
        #import <AVFoundation/AVFoundation.h>
        #import <MediaPlayer/MediaPlayer.h>
        #import <AudioToolbox/AudioToolbox.h>
        #import <MessageUI/MessageUI.h>
        #import <MessageUI/MFMailComposeViewController.h>

        @interface ScreamViewController : UIViewController <UIApplicationDelegate, AVAudioPlayerDelegate, MFMailComposeViewControllerDelegate> {
            // AudioPlayer related
            AVAudioPlayer *appSoundPlayer;
            NSURL *soundFileURL;
            BOOL interruptedOnPlayback;
            BOOL playing;

            // Scream button related
            IBOutlet UIButton *screamButton;
            int currentScreamIndex;
            NSString *currentScream;
            NSMutableArray *screams;
            NSMutableArray *personScreaming;
            NSMutableArray *photoArray;
            int currentSayingsIndex;
            NSString *currentButtonSaying;
            NSMutableArray *funnyButtonSayings;
            IBOutlet UILabel *funnyButtonSayingsLabel;
            IBOutlet UILabel *personScreamingField;
            IBOutlet UIImageView *personScreamingImage;

            // Mailing the scream related
            IBOutlet UILabel *mailStatusMessage;
            IBOutlet UIButton *shareButton;
        }

        // AudioPlayer related
        @property (nonatomic, retain) AVAudioPlayer *appSoundPlayer;
        @property (nonatomic, retain) NSURL *soundFileURL;
        @property (readwrite) BOOL interruptedOnPlayback;
        @property (readwrite) BOOL playing;

        // Scream button related
        @property (nonatomic, retain) UIButton *screamButton;
        @property (nonatomic, retain) NSMutableArray *screams;
        @property (nonatomic, retain) NSMutableArray *personScreaming;
        @property (nonatomic, retain) NSMutableArray *photoArray;
        @property (nonatomic, retain) UILabel *personScreamingField;
        @property (nonatomic, retain) UIImageView *personScreamingImage;
        @property (nonatomic, retain) NSMutableArray *funnyButtonSayings;
        @property (nonatomic, retain) UILabel *funnyButtonSayingsLabel;

        // Mailing the scream related
        @property (nonatomic, retain) IBOutlet UILabel *mailStatusMessage;
        @property (nonatomic, retain) IBOutlet UIButton *shareButton;

        // Scream button
        - (IBAction) scream:(id)sender;

        // Mail the scream
        - (IBAction) showPicker:(id)sender;
        - (void)displayComposerSheet;
        - (void)launchMailAppOnDevice;

        @end

    Here's the top of the .m:

        #import "ScreamViewController.h"

        // The top of the file has the audio session callback function for responding
        // to audio route changes (from Apple's code); then my code continues...

        @implementation ScreamViewController

        @synthesize appSoundPlayer;          // AVAudioPlayer object for playing the selected scream
        @synthesize soundFileURL;            // Path to the scream
        @synthesize interruptedOnPlayback;   // Was application interrupted during audio playback
        @synthesize playing;                 // Track playing/not playing state
        @synthesize screamButton;            // Press this button, girls scream.
        @synthesize screams;                 // Mutable array holding strings pointing to sound files of screams.
        @synthesize personScreaming;         // Mutable array tracking the person doing the screaming
        @synthesize photoArray;              // Mutable array holding strings pointing to photos of screaming girls
        @synthesize personScreamingField;    // Field updates to announce which girl is screaming.
        @synthesize personScreamingImage;    // Updates to show image of the screamer.
        @synthesize funnyButtonSayings;      // Mutable array holding the sayings
        @synthesize funnyButtonSayingsLabel; // Label that updates with the funnyButtonSayings
        @synthesize mailStatusMessage;       // did the email go out
        @synthesize shareButton;             // share scream via email

    The next block contains the offending line:

        - (IBAction) scream:(id)sender {
            // Play a click sound effect
            SystemSoundID soundID;
            NSString *sfxPath = [[NSBundle mainBundle] pathForResource:@"aClick" ofType:@"caf"];
            AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:sfxPath], &soundID);
            AudioServicesPlaySystemSound(soundID);

            // Because someone may slam the scream button over and over,
            // must stop the current sound, then begin the next
            if ([self appSoundPlayer] != nil) {
                [[self appSoundPlayer] setDelegate:nil];
                [[self appSoundPlayer] stop];
                [self setAppSoundPlayer:nil];
            }

            // After selecting a random index in the array (did that in viewDidLoad),
            // we move to the next scream on each click.
            // First check: are we past the end of the array?
            if (currentScreamIndex == [screams count]) {
                currentScreamIndex = 0;
            }

            // Get the string at the index in the screams array
            currentScream = [screams objectAtIndex:currentScreamIndex];

            // Get the string at the index in the personScreaming array
            NSString *screamer = [personScreaming objectAtIndex:currentScreamIndex];

            // Log the string to the console
            NSLog(@"playing scream: %@", screamer);

            // Display the string in the personScreamingField field
            NSString *listScreamer = [NSString stringWithFormat:@"scream by: %@", screamer];
            [personScreamingField setText:listScreamer];

            // Gets the file system path to the scream to play.
            NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:currentScream ofType:@"caf"];

            // Converts the sound's file path to an NSURL object
            NSURL *newURL = [[NSURL alloc] initFileURLWithPath:soundFilePath];
            self.soundFileURL = newURL;
            [newURL release];

            [[AVAudioSession sharedInstance] setDelegate:self];
            [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];

            // Registers the audio route change listener callback function
            AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
                                            audioRouteChangeListenerCallback,
                                            self);

            // Activates the audio session.
            NSError *activationError = nil;
            [[AVAudioSession sharedInstance] setActive:YES error:&activationError];

            // Instantiates the AVAudioPlayer object, initializing it with the sound
            AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:nil];

            // Error check and continue
            if (newPlayer != nil) {
                self.appSoundPlayer = newPlayer;
                [newPlayer release];

                [appSoundPlayer prepareToPlay];
                [appSoundPlayer setVolume:1.0];
                [appSoundPlayer setDelegate:self];

                // NEXT LINE IS FLAGGED BY INSTRUMENTS AS LEAKY
                [appSoundPlayer play];
                playing = YES;

                // Get the string at the index in the photoArray array
                NSString *screamerPic = [photoArray objectAtIndex:currentScreamIndex];

                // Log the string to the console
                NSLog(@"displaying photo: %@", screamerPic);

                // Display the image of the person screaming
                personScreamingImage.image = [UIImage imageNamed:screamerPic];

                // Show the share button
                shareButton.hidden = NO;
                mailStatusMessage.hidden = NO;
                mailStatusMessage.text = @"share!";

                // Get the string at the index in the funnySayings array
                currentSayingsIndex = random() % [funnyButtonSayings count];
                currentButtonSaying = [funnyButtonSayings objectAtIndex:currentSayingsIndex];
                NSString *theSaying = [funnyButtonSayings objectAtIndex:currentSayingsIndex];
                [funnyButtonSayingsLabel setText:theSaying];

                currentScreamIndex++;
            }
        }

    Here's my dealloc:

        - (void)dealloc {
            [appSoundPlayer stop];
            [appSoundPlayer release], appSoundPlayer = nil;
            [screamButton release], screamButton = nil;
            [mailStatusMessage release], mailStatusMessage = nil;
            [personScreamingField release], personScreamingField = nil;
            [personScreamingImage release], personScreamingImage = nil;
            [funnyButtonSayings release], funnyButtonSayings = nil;
            [funnyButtonSayingsLabel release], funnyButtonSayingsLabel = nil;
            [screams release], screams = nil;
            [personScreaming release], personScreaming = nil;
            [soundFileURL release];
            [super dealloc];
        }

        @end

    Thanks so much for reading this far! Any input appreciated.
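
    Two things I would look at (my guesses from the code and the trace, not a confirmed fix): the route-change listener and audio session setup are re-run on every button press, and a new click SystemSoundID is created per press and never released with AudioServicesDisposeSystemSoundID. A sketch that hoists the one-time work out of the action (clickSoundID is a hypothetical ivar):

        // One-time setup, e.g. in viewDidLoad, instead of on every press
        - (void)viewDidLoad {
            [super viewDidLoad];

            [[AVAudioSession sharedInstance] setDelegate:self];
            [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
            [[AVAudioSession sharedInstance] setActive:YES error:nil];

            // Register the route-change callback exactly once; registering it on
            // each press accumulates listener state that is never torn down
            AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
                                            audioRouteChangeListenerCallback,
                                            self);

            // Create the click sound once; per-press AudioServicesCreateSystemSoundID
            // calls without a matching AudioServicesDisposeSystemSoundID each leak
            NSString *sfxPath = [[NSBundle mainBundle] pathForResource:@"aClick" ofType:@"caf"];
            AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:sfxPath], &clickSoundID);
        }

        // Then in scream: just play the cached sound
        AudioServicesPlaySystemSound(clickSoundID);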

    Read the article

  • How to open the callout of an MKAnnotationView programmatically? (iPhone, MapKit)

    - by StijnSpijker
    I want to open up the callout for an MKPinAnnotationView programmatically. E.g., I drop 10 pins on the map and want to open up the one closest to me. How would I go about doing this? Apple has specified the 'selected' parameter for MKAnnotationViews, but discourages setting it directly (and indeed that doesn't work; I tried it). Beyond that, MKAnnotationView only has a setHighlighted (same story) and a canShowCallout method. Any hints as to whether this is possible at all?
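
    For what it's worth (the standard approach as far as I know): the selection lives on the map view, not on the annotation view, so you ask MKMapView to select the annotation and it shows the callout as long as canShowCallout is YES. Here annotationToOpen stands for whichever of the 10 annotations is closest; computing that distance is left out:

        // Open the callout
        [mapView selectAnnotation:annotationToOpen animated:YES];

        // And to close it again
        [mapView deselectAnnotation:annotationToOpen animated:YES];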

    Read the article
