Search Results

Search found 2292 results on 92 pages for 'ian scho es'.


  • iPhone Game Developers - What does your toolchain look like?

    - by slf
    For example:
        source control: git + Adobe Drive
        3d: Google SketchUp - *.dae - Blender - *.obj
        2d: Photoshop/Illustrator - *.png
        audio: Audacity - *.caf
        code: ArgoUML, Xcode, TextMate
        test: OCUnit
        build: rake, Xcode
    Feel free to mention any other tools that you think are awesome :) Changed to Community Wiki

    Read the article

  • Background problem of opengl 3d object over iphone camera view

    - by user292127
    Hi, I'm loading OpenGL 3D objects over the iPhone camera view. When the OpenGL view is loaded, it draws the 3D object on a black background, and that black background blocks the camera view. I just want to clear the background color of the OpenGL view so that only the 3D object is drawn over the camera view. I have tried glClearColor(1.0, 1.0, 1.0, 0.0); but the background color did not change. I have also tried to clear the background color of the OpenGL view using [glview setBackgroundColor:[UIColor clearColor]];. No change in background color. Can anyone help me with this? I'm new to OpenGL. Thanks in advance.
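
    A minimal sketch, not from the thread, of the usual approach (assumption: a standard CAEAGLLayer-backed view; the layer must be non-opaque and use an RGBA color format so a transparent clear color actually lets the camera preview show through):

        // Assumed EAGLView setup: make the CAEAGLLayer transparent with an alpha-capable format
        CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
        eaglLayer.opaque = NO;   // let the camera preview show through
        eaglLayer.drawableProperties = @{
            kEAGLDrawablePropertyRetainedBacking : @NO,
            kEAGLDrawablePropertyColorFormat     : kEAGLColorFormatRGBA8 };

        // Each frame: clear to fully transparent before drawing the 3D object
        glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);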

    Read the article

  • ClassCastException on GL11ExtensionPack. Rendering to texture on OpenGL Android

    - by Joe
    I'm trying to render to a texture (really thought it would be easier than this!). I found this resource: which seems to be exactly what I want. However, I'm getting a ClassCastException on: GL11ExtensionPack gl11ep = (GL11ExtensionPack) gl; Can anyone tell me why?

        public void renderToTexture(GLRenderer glRenderer, GL10 gl) {
            boolean checkIfContextSupportsExtension = checkIfContextSupportsExtension(gl, "GL_OES_framebuffer_object");
            if (checkIfContextSupportsExtension) {
                GL11ExtensionPack gl11ep = (GL11ExtensionPack) gl;
                int mFrameBuffer = createFrameBuffer(gl, texture.getWidth(), texture.getHeight(), texture.getGLID());
                gl11ep.glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, mFrameBuffer);
                gl.glViewport(0, 0, texture.getWidth(), texture.getHeight());
                gl.glLoadIdentity();
                int halfWidth = texture.getWidth() / 2;   // width/2
                int halfHeight = texture.getHeight() / 2; // height/2
                GLU.gluOrtho2D(gl, -halfWidth, halfWidth, -halfHeight, halfHeight);
                gl.glMatrixMode(GL10.GL_MODELVIEW);
                gl.glLoadIdentity();
                gl.glEnable(GL10.GL_TEXTURE_2D);
                Quad quad = new Quad(texture.getWidth(), texture.getHeight());
                quad.setTexture(texture);
                SpriteRenderable sr = new SpriteRenderable(quad);
                //sr.setPosition(new Interpolation<Vector2>(new Vector2(-(float)texture.getWidth()/2f,-(float)texture.getHeight()/2f)));
                //sr.setPosition(new Interpolation<Vector2>(new Vector2((float)(texture.getWidth()/2f),0)));
                //sr.setPosition(new Interpolation<Vector2>(new Vector2(200,-200)));
                sr.setAngle(0.05f);
                sr.renderTo(glRenderer, gl, 1);
                for (Renderable renderable : renderThese) {
                    if (renderable.isVisible()) {
                        renderable.renderTo(glRenderer, gl, 1);
                    }
                }
                gl11ep.glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, 0);
            }
        }

    Read the article

  • 3D Texture mapping

    - by Joe Cannatti
    In an .obj file, it is possible to specify 3 values on a vt line: vt 0.769645 0.729072 0.00000000. The .obj spec says it's for "depth". What does this actually do, and when is it useful?

    Read the article

  • EAGLView orientation changes and strange buffering

    - by Drew
    I'm writing an app that offloads some heavy drawing into an EAGLView, and it does some lightweight stuff in UIKit on top. It seems that the GL render from before an orientation change is cached somewhere, and I can't figure out how to get access to it. Specifically:
        1. After an orientation change, calling glClear(GL_COLOR_BUFFER_BIT) isn't enough to clear the GL display (is the drawing cached somewhere?). How can I clear this cache?
        2. After an orientation change, glReadPixels() can no longer access pixels drawn before the orientation change. How can I get access to where this is stored?
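
    Not an answer from the article, just a hedged sketch of the housekeeping many EAGLView implementations do when the layer is resized on rotation (assumes an ES 1.1 context named context and a color renderbuffer named colorRenderbuffer, both created elsewhere):

        // Called when the view (and its CAEAGLLayer) changes size, e.g. on rotation
        - (void)layoutSubviews {
            [EAGLContext setCurrentContext:context];
            glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
            // Reattach storage to the resized layer, which discards the old contents
            [context renderbufferStorage:GL_RENDERBUFFER_OES
                            fromDrawable:(CAEAGLLayer *)self.layer];
            glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT);
            [context presentRenderbuffer:GL_RENDERBUFFER_OES];  // present the cleared buffer
        }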

    Read the article

  • Why does UIImageView "darken"/saturate PNG images, and can I stop it?

    - by Ben
    I have a PNG file in a UIImageView, and next to that I have an EAGLView which displays the continuation of that same image (long story) as a texture, carved from the same original PNG. The point is, that these images, which should match up flawlessly, actually have somewhat differing color saturation. Normally I'd blame my handling of the PNG texture load in GL, but when I hold Preview (with the PNG) up to the iPhone simulator, it's GL that's spot on, and the UIImageView that's wrong! It's taken the image and made it ever-so-slightly more saturated. The image view is opaque with 100% alpha. I verified this on a clean UIImageView with another PNG file when put next to Preview. Anyone know what's up?

    Read the article

  • Trouble with Native OpenGL Renderer

    - by CaseyB
    I am using native code to render OpenGL in Android and I get periodic errors that look like this:
        ERROR/IMGSRV(1435): frameresource.c:610: WaitUntilResourceIsNotNeeded: PVRSRVEventObjectWait failed
        ERROR/IMGSRV(1018): sgxif.c:124: WaitForRender: PVRSRVEventObjectWait failed
        ERROR/IMGSRV(1435): osfunc_um.c:318: PVRSRVEventObjectWait: Error 13 returned
    Once these errors come up I have to restart the phone or the rendering won't start again correctly. I have done a lot of web searching and I can't find out what could be the cause of these errors. Does anyone else have any suggestions?

    Read the article

  • Weird order when painting triangle outlines using GL_LINE_STRIP

    - by RayDeeA
    I'm developing an app for iOS platforms using OpenGL. Currently I'm having a weird issue when painting a plane (terrain) which consists of multiple subplanes, where each subplane consists of 2 triangles forming a rect. I'm painting this terrain as a wireframe by calling glDrawElements with GL_LINE_STRIP and the precalculated indices. The problem is that the triangles get painted in the wrong order, or rather vertically mirrored. They do not get painted in the order in which I specified the indices, which is confusing. This is the simplified code to generate the vertices:

        for (NSInteger y = -gridSegmentsY / 2; y < gridSegmentsY / 2; y++) {
            for (NSInteger x = -gridSegmentsX / 2; x < gridSegmentsX / 2; x++) {
                vertices[pos++] = x * 5;
                vertices[pos++] = y * 5;
                vertices[pos++] = 0;
            }
        }

    This is how I generate the indices, including degenerate ones (to use as an IBO):

        pos = 0;
        for (int y = 0; y < gridSegmentsY - 1; y++) {
            if (y > 0) {
                // Degenerate begin: repeat first vertex
                indices[pos++] = (unsigned short)(y * gridSegmentsY);
            }
            for (int x = 0; x < gridSegmentsX; x++) {
                // One part of the strip
                indices[pos++] = (unsigned short)((y * gridSegmentsY) + x);
                indices[pos++] = (unsigned short)(((y + 1) * gridSegmentsY) + x);
            }
            if (y < gridSegmentsY - 2) {
                // Degenerate end: repeat last vertex
                indices[pos++] = (unsigned short)(((y + 1) * gridSegmentsY) + (gridSegmentsX - 1));
            }
        }

    So in this part...

        indices[pos++] = (unsigned short)((y * gridSegmentsY) + x);
        indices[pos++] = (unsigned short)(((y + 1) * gridSegmentsY) + x);

    ...I'm setting the first index in the indices array to point to the current (x, y) and the next index to (x, y+1). I'm doing this for all x's in the current strip, then I'm handling degenerate triangles and repeating this procedure for the next strip (y+1). This method is taken from http://www.learnopengles.com/android-lesson-eight-an-introduction-to-index-buffer-objects-ibos/ So I expect the resulting mesh to get painted like:

        a----b----c
        |   /|   /|
        |  / |  / |
        | /  | /  |
        |/   |/   |
        d----e----f
        |   /|   /|
        |  / |  / |
        | /  | /  |
        |/   |/   |
        g----h----i

    by painting it as described using:

        glDrawElements(GL_LINE_STRIP, indexCount, GL_UNSIGNED_SHORT, 0);

    ...since I expect GL_LINE_STRIP to paint first a line from (a-d), then from (d-b), then (b-e)... and so on (as specified in the indices calculation). But what actually gets painted is:

        *----*----*
        |\   |\   |
        | \  | \  |
        |  \ |  \ |
        |   \|   \|
        *----*----*
        |\   |\   |
        | \  | \  |
        |  \ |  \ |
        |   \|   \|
        *----*----*

    So the triangles are somehow painted in the wrong order and I need to know why ;). Does somebody know? Does the problem lie in using GL_LINE_STRIP, or is there a bug in my code? My eye is at (0.0f, 0.0f, 20.0f) and looks at (0, 0, 0). The mesh is painted along the x-axis & y-axis from left to right with z = 0, so the mesh should not be flipped or anything.

    Read the article

  • using a texture mesh and wireframe mesh in threejs

    - by Andy Poes
    I'm trying to draw a wireframe mesh and a textured mesh in three.js, but when both are added to my scene the textured mesh doesn't display. Code below. I'm having trouble creating two meshes that share the same geometry where one of the materials is wireframe and the other is a texture. If one of the materials is wireframe and the other is just a color fill it works fine, but as soon as I make the second material a texture it stops working. If I comment out scene.add( wireMesh ); then the textured mesh shows up.

        var wireMat = new THREE.MeshBasicMaterial( { color: 0x00FFFF, wireframe: true, transparent: true, overdraw: true } );
        var wireMesh = new THREE.Mesh( geometry, wireMat );
        scene.add( wireMesh );

        var texture = THREE.ImageUtils.loadTexture( 'textures/world.jpg' );
        var imageMat = new THREE.MeshBasicMaterial( { color: 0xffffff, map: texture } );
        var fillMesh = new THREE.Mesh( geometry, imageMat );
        scene.add( fillMesh );

    Read the article

  • Help me start out with OpenGL

    - by Arun Thakkar
    Until today I have been working on basic UIKit applications, but from now on I need to work in OpenGL. The problem is that I don't have any idea about OpenGL and am quite confused about how and where to start. I need to create an application that is the same as "iBeer" (see the movie on YouTube). So I am confused about how to create the beer graphics you see in that application; which library should I prefer?

    Read the article

  • OpenGL iPhone SDK: How to tell if you're touching an object on screen?

    - by TheGambler
    First is my touchesBegan function and then the struct that stores the values for my object. I have an array of these objects and I'm trying to figure out, when I touch the screen, whether I'm touching an object on the screen. I don't know if I need to do this by iterating through all my objects and figuring out if I'm touching one that way, or maybe there is an easier, more efficient way. How is this usually handled?

        -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            [super touchesBegan:touches withEvent:event];
            UITouch *touch = ([touches count] == 1 ? [touches anyObject] : nil);
            CGRect bounds = [self bounds];
            CGPoint location = [touch locationInView:self];
            location.y = bounds.size.height - location.y;
            float xTouched = location.x/20 - 8 + ((int)location.x % 20)/20;
            float yTouched = location.y/20 - 12 + ((int)location.y % 20)/20;
        }

        typedef struct object_tag   // Create A Structure Called Object
        {
            int tex;        // Integer Used To Select Our Texture
            float x;        // X Position
            float y;        // Y Position
            float z;        // Z Position
            float yi;       // Y Increase Speed (Fall Speed)
            float spinz;    // Z Axis Spin
            float spinzi;   // Z Axis Spin Speed
            float flap;     // Flapping Triangles :)
            float fi;       // Flap Direction (Increase Value)
        } object;
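
    Not from the article, just a minimal hedged sketch of the brute-force iteration the asker describes (assumes a hypothetical objects array of objectCount entries lying on the z = 0 plane, each treated as a square of half-size 0.5 in the same world units as xTouched/yTouched):

        // Hypothetical hit test: compare the unprojected touch position against each object
        - (int)objectIndexAtX:(float)xTouched y:(float)yTouched {
            const float halfSize = 0.5f;              // assumed half-extent of each object
            for (int i = 0; i < objectCount; i++) {
                if (fabsf(objects[i].x - xTouched) <= halfSize &&
                    fabsf(objects[i].y - yTouched) <= halfSize) {
                    return i;                         // touched this object
                }
            }
            return -1;                                // nothing was hit
        }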

    Read the article

  • Alternative of touchesMoved in Unity3D

    - by Arman
        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint location = [touch locationInView:touch.view];
            CGPoint xLocation = CGPointMake(location.x, racquet_yellow.center.y);
            racquet_yellow.center = xLocation;
        }

    The above handler moves racquet_yellow (a UIImageView) with the mouse pointer, or when I move my thumb on the iPhone screen. I have a 3D object in Unity3D; how can I move my object the way racquet_yellow moves? Kindly guide me.

    Read the article

  • EAGLContext, EAGLSharegroups, RenderBuffers, FrameBuffers, oh my!

    - by quixoto
    Hi all, I'm trying to wrap my head around the OpenGL object model on iPhone OS. I'm currently rendering into a few different UIViews (built on CAEAGLLayers) on the screen. I currently have each of these using a separate EAGLContext, each of which has a color renderbuffer and a framebuffer. I'm rendering similar things in them, and I'd like to share textures between these instances to save memory overhead. My current understanding is that I could use the same setup (some number of contexts, each with an FBO/RBO), but if I spawn the later ones using the EAGLSharegroup of the first one, then I can simply use the texture names (GLuints) from the first one in the later ones. Is this accurate? If this is the case, I guess the follow-up question is: what's the benefit of having it be a "sharegroup"? Could I just reuse the same context and attach multiple FBOs/RBOs to that context? I think I'm struggling with the abstraction layer of a sharegroup, which seems to share "objects" (textures and other named things) but not "state" (matrices, enabled/disabled states), which is owned by the context. What's the best way to think of this? Thanks for any enlightenment!
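
    A minimal hedged sketch of the sharegroup setup described above (assumption: ES 1.1 contexts; a texture name created while one context is current can then be bound in the other, while each context keeps its own FBO/RBO):

        // First context implicitly creates the sharegroup
        EAGLContext *firstContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];

        // Later contexts join the same sharegroup, so texture names (GLuints) are shared
        EAGLContext *secondContext =
            [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1
                                  sharegroup:firstContext.sharegroup];

        // Framebuffers and renderbuffers remain per-context; only named objects
        // such as textures and buffers are shared through the sharegroup.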

    Read the article

  • Detecting Touches in an OpenGL rendered scene

    - by Icky
    Hey. I was wondering whether there is a way to detect a touch in an OpenGL rendered scene. What I have is a set of images which are being rendered in my main view. Now if the user touches one of these images (or objects), I would like to know which one was touched - similar to the CGRectContainsPoint(frame, [touch locationInView:self.view]) method. Is there an easy way to find out? If there is none, knowing that would also help.
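
    A hedged sketch of the simplest approach, not taken from the article (assumption: the screen-space bounds of each rendered image are tracked on the CPU in a hypothetical frames array of CGRects, in UIKit view coordinates):

        - (NSInteger)imageIndexForTouch:(UITouch *)touch {
            CGPoint p = [touch locationInView:self.view];
            for (NSInteger i = 0; i < frameCount; i++) {
                if (CGRectContainsPoint(frames[i], p)) {
                    return i;               // the i-th rendered image was touched
                }
            }
            return NSNotFound;              // no image under the touch
        }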

    Read the article

  • Bitmapfontatlas cocos2d performance issue

    - by meboz
    This question was also asked in the cocos2d forums, but they are a little slower than here. Hi, I'm trying to resolve a performance issue in my game. I have all of my game images on the one spritesheet. I now have a score label for which I have generated a font file with the Hiero tool. Ideally I'd like to update my score label with the current score on every update. However, there seems to be a significant performance hit when doing so, which I believe is the result of the game-image texture and the font texture swapping on each update. Can anyone suggest a way to avoid this? Possibly by combining the font images with the game images in the one image file to avoid the texture swapping? Cheers

    Read the article

  • Understanding OpenGL Matrices

    - by Omega
    I'm starting to learn about 3D rendering and I've been making good progress. I've picked up a lot regarding matrices and the general operations that can be performed on them. One thing I'm still not quite following is OpenGL's use of matrices. I see this (and things like it) quite a lot:

        x y z n
        -------
        1 0 0 0
        0 1 0 0
        0 0 1 0
        0 0 0 1

    So my best understanding is that it is a normalized (no magnitude) 4-dimensional, column-major matrix. Also that this matrix in particular is called the "identity matrix". Some questions: What is the "nth" dimension? How and when are these applied? My biggest confusion arises from how OpenGL makes use of this kind of data.
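
    Not part of the question, just a hedged worked example of what the fourth component (the "n" column, usually called w) does: with w = 1 on a point, the last column of a column-major 4x4 matrix contributes a translation.

        /* Column-major 4x4 translation matrix, laid out the way OpenGL stores it.
           tx, ty, tz are placeholder translation amounts. */
        GLfloat tx = 2.0f, ty = 0.0f, tz = 0.0f;
        GLfloat m[16] = {
            1,  0,  0,  0,    /* column 0 */
            0,  1,  0,  0,    /* column 1 */
            0,  0,  1,  0,    /* column 2 */
            tx, ty, tz, 1     /* column 3: translation, picked up because w = 1 */
        };

        /* Multiplying a point (x, y, z, 1):
             x' = 1*x + 0*y + 0*z + tx*1 = x + tx
           A direction vector uses w = 0, so it is not affected by the translation column. */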

    Read the article

  • Simple Android OpenGL App Lag

    - by Eugene
    Hi, I have an Android OpenGL application which simply draws 2D squares (using 2 triangles) and animates them moving down the screen. I essentially do this by running: glLoadIdentity(), then glTranslatef, and finally glDrawElements all in a for loop. (The for loop is to draw all 10 blocks on the screen for every frame). In every drawFrame, the y-position of the blocks increments for the animation. The problem I'm having is strange. I run the application and the animation is laggy and not smooth. Then I re-run the application and I get a smooth animation. If I run again, I may get a smooth animation, or possibly not. Is my method correct, or is there a better way of doing this animation? Thanks for the help!

    Read the article

  • Porting WebGL game to iPhone's native OpenGL?

    - by ArtPulse
    We are developing a web game that uses WebGL for the two biggest parts of it. Working with HTML / CSS was too slow and too limited, so it's off the table. Thing is, iOS does not support WebGL publicly just yet, only in iAd. It is my guess that Apple will eventually support it once the security issues it and Microsoft claim it has are fixed and it looks stable enough. The problem is, if Apple does not do this by the release of the next major iOS version, then we will have on our hands a mobile WebGL game that does not run: 6 months of development and testing gone to waste. So, questions: If that were the case, how viable (regarding amount of time) is it to port the WebGL part of the game to native iPhone OpenGL? I'm afraid that porting will take longer than the development of the game itself. I saw posts on Stack Overflow (like this) that suggested, on Android, adding the OpenGL interface manually to a WebKit element. It'd be slower than native. But either way... is this something that could be accepted in the App Store? Apple is very restrictive with this kind of stuff... Thank you all for your time!

    Read the article

  • Drawing individual pixels with the iPhone SDK

    - by blob8108
    Hi, I've been trying to figure out how to make a powder toy style game on the iPhone. My problem is how to draw pixels to the screen. From what I've read, OpenGL is better for games as it is faster/hardware accelerated, but there is no method to draw pixels directly to the screen. Apparently drawing pixels to an off-screen frame buffer is the way to go, but how do I then pass this to OpenGL? Do I use a texture? (this is assuming I have no previous knowledge of iPhone graphics programming). Thanks!
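
    A hedged sketch of the texture approach the question hints at (assumption: a CPU-side RGBA buffer named pixels of size width x height that is rewritten each frame, re-uploaded, and then drawn on a full-screen textured quad):

        /* One-time setup: allocate a texture matching the pixel buffer */
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);

        /* Every frame: update the CPU buffer, re-upload it, then draw a textured quad */
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                        GL_RGBA, GL_UNSIGNED_BYTE, pixels);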

    Read the article

  • NAN mixing float and GLFloat?

    - by carrots
    This often returns NaN ("Not A Number") depending on input:

        #define PI 3.1415f
        GLfloat sineEaseIn(GLfloat ratio) {
            return 1.0f - cosf(ratio * (PI / 2.0f));
        }

    I tried making PI a few digits smaller to see if that would help. No dice. Then I thought it might be a datatype mismatch, but float and GLfloat seem to be equivalent:

        gl.h:    typedef float GLfloat;
        math.h:  extern float cosf( float );

    Is this a casting issue?
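
    Worth noting (not from the article): cosf of any finite argument stays within [-1, 1], so a NaN result almost always means the incoming ratio is already NaN or infinite rather than a float/GLfloat mismatch. A hedged sketch of a quick check:

        #include <math.h>

        GLfloat sineEaseIn(GLfloat ratio) {
            if (isnan(ratio) || isinf(ratio)) {
                /* the NaN/Inf arrives with the argument, not from the cast or cosf */
                return 0.0f;   /* or log/assert here while debugging */
            }
            return 1.0f - cosf(ratio * ((GLfloat)M_PI / 2.0f));
        }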

    Read the article

  • 2D distortion of a face from an image on iOS? (similar to Fat Booth etc.)

    - by Dominik Hadl
    I was just wondering if someone knows about a good library or tutorial on how to achieve a 2D distortion of a face taken from an image captured by the user. I would like to achieve a similar effect to the one in Fatify, Oldify, and all those Fat Booth apps, because I am creating an app where you throw something at the face, and I would like the face to jiggle and move when the object hits it. How should I do this?

    Read the article
