Search Results

Search found 3627 results on 146 pages for 'opengl es'.


  • Drawing an unfilled rectangle shape in C++ OpenGL

    - by Bahaa
    I want to draw an unfilled rectangle shape in OpenGL using the C++ programming language, but when I use glBegin(GL_QUADS) or glBegin(GL_POLYGON), the resulting shape is filled, and I want it to be unfilled. How can I draw an unfilled rectangle?

        void draweRect(void)
        {
            glClear(GL_COLOR_BUFFER_BIT);
            glColor3f(0.0, 0.0, 1.0);
            glLineWidth(30);
            glBegin(GL_POLYGON);
            glVertex2i(50, 90);
            glVertex2i(100, 90);
            glVertex2i(100, 150);
            glVertex2i(50, 150);
            glEnd();
            glFlush();
        }
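
    A minimal sketch of one common fix (assuming the fixed-function pipeline used above): draw the outline with GL_LINE_LOOP instead of a filled primitive, or switch the polygon rasterization mode to lines:

        // Option 1: connect the corners as an outline
        glBegin(GL_LINE_LOOP);
        glVertex2i(50, 90);
        glVertex2i(100, 90);
        glVertex2i(100, 150);
        glVertex2i(50, 150);
        glEnd();

        // Option 2: keep GL_POLYGON but rasterize it as wireframe
        glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
        // ... glBegin(GL_POLYGON) block as before ...
        glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);   // restore filled mode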

    Read the article

  • OpenGL, draw two polygons at the same time (by mouse clicks)

    - by YoungSalafi
    I'm trying to draw two polygons at the same time, based on user input from the OpenGL screen, so I made two arrays, each of which will carry the vertices of one polygon. I think my logic is right, but the program still draws only one polygon, and it deletes the old polygon if you draw a polygon again. It's acting weird too. Please check the code yourself; here it is (P.S. don't mind the delete function right now, I know it's missing something):

        #include <windows.h>
        #include <gl/gl.h>
        #include <gl/glut.h>

        void Draw();
        void Set_Transformations();
        void Initialize(int argc, char *argv[]);
        void OnKeyPress(unsigned char key, int x, int y);
        void DeleteVer();
        void MouseClick(int bin, int state, int x, int y);
        void GetOGLPos(int x, int y, float* arrY, float* arrX);
        void DrawPolygon(float* arrX, float* arrY);

        float xPos[20];
        float yPos[20];
        float xPos2[20];
        float yPos2[20];
        float fx = 0, fy = 0;
        float size = 10;
        int count = 0;
        bool done = false;
        bool flag = true;

        void Initialize(int argc, char *argv[])
        {
            glutInit(&argc, argv);
            glutInitDisplayMode(GLUT_RGBA);
            glutInitWindowPosition(100, 100);
            glutInitWindowSize(600, 600);
            glutCreateWindow("OpenGL Lab1");
            Set_Transformations();
            glutDisplayFunc(Draw);
            glutMouseFunc(MouseClick);
            glutKeyboardFunc(OnKeyPress);
            glutMainLoop();
        }

        void Set_Transformations()
        {
            glClearColor(1, 1, 1, 1);
            glMatrixMode(GL_PROJECTION);
            glLoadIdentity();
            gluOrtho2D(-200, 200, -200, 200);
        }

        void OnKeyPress(unsigned char key, int x, int y)
        {
            if (key == 27)
                exit(0);
            switch (key)
            {
            case 13: // Enter key: it will draw
                done = true;
                glutPostRedisplay();
                flag = !flag; // switch to the other array, so the second polygon's vertices are stored there
                break;
            }
        }

        void MouseClick(int button, int state, int x, int y)
        {
            switch (button)
            {
            case GLUT_RIGHT_BUTTON:
                if (state == GLUT_DOWN)
                {
                    if (count > 0)
                    {
                        DeleteVer(); // don't mind this right now
                    }
                }
                break;
            case GLUT_LEFT_BUTTON:
                if (state == GLUT_DOWN)
                {
                    if (count < 20)
                    {
                        if (flag =true) {   // drawing first polygon
                            GetOGLPos(x, y, xPos, yPos);
                        }
                        if (flag=false)     // drawing second polygon after Enter is pressed
                            GetOGLPos(x, y, xPos2, yPos2);
                    }
                }
                break;
            }
        }

        void GetOGLPos(int x, int y, float* arrY, float* arrX) // getting the vertices from the user
        {
            GLint viewport[4];
            GLdouble modelview[16];
            GLdouble projection[16];
            GLfloat winX, winY, winZ;
            GLdouble posX, posY, posZ;

            glGetDoublev(GL_MODELVIEW_MATRIX, modelview);
            glGetDoublev(GL_PROJECTION_MATRIX, projection);
            glGetIntegerv(GL_VIEWPORT, viewport);

            winX = (float)x;
            winY = (float)viewport[3] - (float)y;
            glReadPixels(x, int(winY), 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &winZ);
            gluUnProject(winX, winY, winZ, modelview, projection, viewport, &posX, &posY, &posZ);

            arrX[count] = posX;
            arrY[count] = posY;
            count++;

            glPointSize(6.0);
            glBegin(GL_POINTS);
            glVertex2f(posX, posY);
            glEnd();
            glFlush();
        }

        void DeleteVer() // don't mind this
        {
            glColor3f(1, 1, 1);
            glBegin(GL_POINTS);
            glVertex2f(xPos[count-1], yPos[count-1]);
            glEnd();
            glFlush();
            xPos[count] = NULL;
            yPos[count] = NULL;
            count--;
            glColor3f(0, 0, 0);
        }

        void DrawPolygon(float* arrX, float* arrY)
        {
            int n = 0;
            glColor3f(0, 0, 0);
            glBegin(GL_POLYGON);
            while (n < count)
            {
                glVertex2f(arrX[n], arrY[n]);
                n++;
            }
            count = 0;
            glEnd();
            glFlush();
        }

        void Draw() // main drawing func
        {
            glClear(GL_COLOR_BUFFER_BIT);
            glColor3f(0, 0, 0);
            if (done)
            {
                DrawPolygon(xPos, yPos);
                DrawPolygon(xPos2, yPos2);
            }
            glFlush();
        }

        int main(int argc, char *argv[])
        {
            Initialize(argc, argv);
            return 0;
        }
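
    A hedged note on the code above (an observation, not a confirmed fix): flag =true and flag=false are assignments, not comparisons, so the first branch always runs and flag is forced back to true. Note also that DrawPolygon() zeroes the shared count, so the second call has nothing to draw; separate per-polygon counts would be needed. A sketch of the corrected branch:

        // inside MouseClick, GLUT_LEFT_BUTTON / GLUT_DOWN
        if (count < 20)
        {
            if (flag)                        // first polygon
                GetOGLPos(x, y, xPos, yPos);
            else                             // second polygon, after Enter toggles flag
                GetOGLPos(x, y, xPos2, yPos2);
        }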

    Read the article

  • OpenGL multitexture tessellation

    - by user1715296
    I have to tessellate a surface in OpenGL with rectangular textures. Let it be a single triangle for simplicity. The textures touch each other at their sides and do not overlap. That is done by setting GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T to GL_CLAMP_TO_BORDER and adjusting the texture coords properly. Everything goes fine while GL_TEXTURE_MIN_FILTER and GL_TEXTURE_MAG_FILTER are set to GL_NEAREST, but when I apply GL_LINEAR filtering and/or anisotropic filtering, the following artifact appears: the textures' border pixels' alpha gradually falls to transparent, so that a line of background color is visible between neighbouring textures. How can I avoid this artifact, with linear filtering preserved, without merging the multiple textures into one?
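
    A minimal sketch of one common workaround (an assumption, since the exact texture layout isn't shown): with GL_CLAMP_TO_BORDER, linear sampling near an edge blends toward the border color, transparent black by default, which produces exactly this kind of seam; GL_CLAMP_TO_EDGE clamps to the outermost texel row instead:

        // tileTexture is a hypothetical texture id for one of the tiles
        glBindTexture(GL_TEXTURE_2D, tileTexture);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);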

    Read the article

  • GLSL vertex shaders with movement vs. vertices off the screen

    - by user827992
    If I have a vertex shader that manages movement, varying the positions of some vertices in my OpenGL context, is OpenGL smart enough to run this shader only on the vertices visible on the screen? This part of the OpenGL programmable pipeline is not clear to me, because the sources I've found are not really clear about it: they talk about fragments and pixels, and I get that, but what about vertex shaders? If you need a reference, I'm reading from this right now, and this online book has a couple of examples about this.

    Read the article

  • Problems exporting terrain from Autodesk 3ds Max

    - by Jatin Kumar
    I am trying to make a small Counter-Strike sort of game. For the terrain part, I have exported the terrain in 3ds format from Autodesk 3ds Max and imported it into OpenGL using lib3ds. It's working fine, but with a few problems:

        1. The terrain is mainly made up of some cuboid boxes with textures on them, placed on a big flat surface with a boundary wall. In OpenGL I have enabled anti-aliasing, but there is still too much aliasing on the boundaries (visible when rotating the camera).
        2. I have tiled the floor with an image, but in OpenGL it is just the single image stretched over the complete surface (see the sketch after this list).
        3. I have exported an animated model (skeleton + mesh + material + animation) from 3ds Max and used the cal3d library for reading it. The model has a gun, which is not appearing in OpenGL, and it too has too much of an aliasing problem.

    I have googled around but couldn't find any relevant solutions. Thanks in advance.
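
    A minimal sketch for the stretched floor from point 2 (an assumption, since lib3ds may supply its own texture coordinates): with the wrap mode set to GL_REPEAT, texture coordinates greater than 1.0 tile the image instead of stretching it:

        // floorTexture and the floor extents are hypothetical stand-ins
        glBindTexture(GL_TEXTURE_2D, floorTexture);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
        glBegin(GL_QUADS);
        glTexCoord2f( 0.0f,  0.0f); glVertex3f(-100.0f, 0.0f, -100.0f);
        glTexCoord2f(20.0f,  0.0f); glVertex3f( 100.0f, 0.0f, -100.0f);
        glTexCoord2f(20.0f, 20.0f); glVertex3f( 100.0f, 0.0f,  100.0f);
        glTexCoord2f( 0.0f, 20.0f); glVertex3f(-100.0f, 0.0f,  100.0f);
        glEnd();   // the image repeats 20 times along each axis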

    Read the article

  • Complete Guide/Tutorials on LWJGL?

    - by user43353
    Don't get me wrong, I finished the tutorials at http://lwjgl.org/wiki/index.php?title=Main_Page: The Basics section, the OpenGL 3.2 and newer section, and I looked at the Example Code section. They were great tutorials, and I have looked at the external tutorials as well. I don't know where to go from here, and OpenGL is not my strong point. Someone suggested Learning Modern 3D Graphics Programming, but I didn't learn much. I looked at the port to LWJGL, but the book was in C and I couldn't really follow what the OpenGL code meant. I am trying to learn 2D gaming, not 3D (maybe later). Are there any tutorials that aren't C/C++ heavy and teach you 2D OpenGL?

    Read the article

  • Why won't my vertex buffer render in GLFW3?

    - by sm81095
    I have started to try to learn OpenGL, and I decided to use GLFW to assist in window creation. The problem is, since GLFW3 is so new, there are no tutorials on it or on how to use it with modern OpenGL (3.3, specifically). Using the GLFW3 tutorial found on the website, which uses older OpenGL rendering (glBegin(GL_TRIANGLES), glVertex3f(), and such), I can get a triangle to render to the screen. The problem is, using new OpenGL, I can't get the same triangle to render. I am new to OpenGL, and GLFW3 is new to most people, so I may be completely missing something obvious, but here is my code:

        static const GLfloat g_vertex_buffer_data[] = {
            -1.0f, -1.0f, 0.0f,
             1.0f, -1.0f, 0.0f,
             0.0f,  1.0f, 0.0f
        };

        int main(void)
        {
            GLFWwindow* window;
            if (!glfwInit())
            {
                fprintf(stderr, "Failed to initialize GLFW.");
                return -1;
            }
            glfwWindowHint(GLFW_SAMPLES, 4);
            glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
            glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
            glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
            glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

            window = glfwCreateWindow(800, 600, "Test Window", NULL, NULL);
            if (!window)
            {
                glfwTerminate();
                fprintf(stderr, "Failed to create a GLFW window");
                return -1;
            }
            glfwMakeContextCurrent(window);

            glewExperimental = GL_TRUE;
            GLenum err = glewInit();
            if (err != GLEW_OK)
            {
                glfwTerminate();
                fprintf(stderr, "Failed to initialize GLEW");
                fprintf(stderr, (char*)glewGetErrorString(err));
                return -1;
            }

            GLuint VertexArrayID;
            glGenVertexArrays(1, &VertexArrayID);
            glBindVertexArray(VertexArrayID);

            GLuint programID = LoadShaders("SimpleVertexShader.glsl", "SimpleFragmentShader.glsl");

            GLuint vertexBuffer;
            glGenBuffers(1, &vertexBuffer);
            glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
            glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);

            while (!glfwWindowShouldClose(window))
            {
                glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
                glUseProgram(programID);
                glEnableVertexAttribArray(0);
                glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
                glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);
                glDrawArrays(GL_TRIANGLES, 0, 3);
                glDisableVertexAttribArray(0);
                glfwSwapBuffers(window);
                glfwPollEvents();
            }

            glDeleteBuffers(1, &vertexBuffer);
            glDeleteProgram(programID);
            glfwDestroyWindow(window);
            glfwTerminate();
            exit(EXIT_SUCCESS);
        }

    I know it is not my shaders; they are super simple, and I've checked them against GLFW 2.7, so I know that they work. I'm assuming that I've missed something crucial to using the OpenGL context with GLFW3, so any help locating the problem would be greatly appreciated.
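
    Since LoadShaders() isn't shown, one hedged debugging step (a sketch, not the asker's code) is to verify compile and link status inside it; a shader that silently fails to compile under the 3.3 core profile produces exactly this kind of blank screen:

        // hypothetical checks to add inside LoadShaders()
        GLint status;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
        if (status != GL_TRUE) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), NULL, log);
            fprintf(stderr, "shader compile failed: %s\n", log);
        }
        glGetProgramiv(program, GL_LINK_STATUS, &status);   // likewise after glLinkProgram()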

    Read the article

  • Is there an alternative to SDL 1.3 for a C++ game that should run on iOS and Android?

    - by futlib
    I've used SDL for many desktop games, always as the cross-platform glue for:

        • Creating a window
        • Processing input
        • Rendering images
        • Rendering fonts
        • Playing sounds/music

    It has never disappointed me at those tasks. But when it comes to graphics, I prefer to work with the OpenGL API directly, even though all of our games are 2D. In the project I'm currently working on, I've made sure to use only the API subset supported by both OpenGL 1.3 and OpenGL ES 1.0, so making the thing run on Android should be easy, I thought. It turns out there is no official Android or iOS port of SDL yet; however, there's one in SDL 1.3, which is still in development. SDL 1.3 doesn't seem very appealing to me, for three reasons:

        1. It's been in development for at least 4 years, and I have no idea when it will be done, not to mention stable.
        2. It's not ported to as many platforms as SDL 1.2.
        3. From what I've seen, it uses OpenGL for drawing, so I suppose the community will move away from directly using OpenGL.

    So I'm wondering if I should use a different library for our current project; it doesn't matter much whether I need to port my existing code from SDL 1.2 to SDL 1.3 or to some other library. We're planning to release on Windows, Mac OS X, Linux, iOS and Android, so good support for these platforms is essential. Is there anything stable that does what I want?

    Read the article

  • What are the factors that determine the default frequency of a shader call?

    - by user827992
    After playing for some days with various vertex and fragment shaders, it seems clear to me that these programs are called by the GPU on each and every rendering cycle. The problem is that I can't really quantify this frequency, and I can't tell whether it is based on some default values or not, because I don't have a big collection of hardware right now to do extensive tests. For all I know, the answer could be really trivial, like "it's the same as the refresh rate of your monitor", but I would like some good answers to be clear on this. For instance, it looks really odd to me that all the techniques I have seen so far for controlling the FPS use a call to the GLUT function glutGet(GLUT_ELAPSED_TIME) to retrieve a value in ms for when the rendering started, yet I have to rely on the CPU to do the math. Why can't I set an FPS value in OpenGL, if OpenGL clearly has a counter and a timer/clock? P.S. I'm referring to OpenGL 3.0+.
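
    A minimal sketch of the CPU-side cap the question describes (an assumption that GLUT is available, as glutGet implies; 16 ms per frame approximates 60 FPS):

        // hypothetical end-of-frame throttle using GLUT's millisecond clock
        int now = glutGet(GLUT_ELAPSED_TIME);    // ms since glutInit()
        int elapsed = now - frameStart;          // frameStart taken at frame begin
        if (elapsed < 16) {
            // sleep off the rest of the frame budget:
            // usleep((16 - elapsed) * 1000) on POSIX, Sleep(16 - elapsed) on Windows
        }

    For what it's worth, OpenGL 3.3 does expose GPU timer queries (GL_TIMESTAMP), but no frame-rate setter; pacing is left to the application, or to vsync via the platform's swap-interval extension.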

    Read the article

  • Use Android NDK for portability with iOS?

    - by J-F L-R
    I am currently planning to implement a little painting app using OpenGL ES 1.1, and I believe this question applies to any OpenGL ES project. I am starting development on Android, and I would like to know whether you would recommend writing the drawing logic (using OpenGL) in C++ with the NDK so it will be easier to port to iOS, or using the Java API and being locked to Android. The reason I am asking is that I have seen mixed opinions on the Web about using the NDK (some people say it is an added level of complexity). From what I have already seen, I believe that I should go with the Java API since I am starting on Android, and then, if I decide to go to iOS, rewrite the OpenGL logic in Objective-C or C++. This should be pretty straightforward, since the calls appear to be the same in both languages. What do you think? Am I right?

    Read the article

  • Flash-like animation editing and container format for an OpenGL environment?

    - by tbarbe
    Are there ANY tools that let an animator/designer create scripted animations that can export to an OpenGL-compatible format, similar to the timeline editing in Flash or After Effects? Does OpenGL ES have some kind of animation playback or container format? (Is there something similar to .swf for OpenGL?) I'm looking for something that lets a designer/animator do his work with a timeline, in a traditional animation environment, while still having integration with OpenGL.

    Read the article

  • How to draw an Arc in OpenGL

    - by rpgFANATIC
    While making a little Pong game in C++ OpenGL, I decided it'd be fun to create arcs (semi-circles) when stuff bounces. I decided to skip Bezier curves for the moment and just go with straight algebra, but I didn't get far. My algebra follows a simple quadratic function (y = ±sqrt(mx + c)). This little excerpt is just an example I've yet to fully parameterize; I just wanted to see how it would look. When I draw this, however, it gives me a straight vertical line where the line's tangent approaches -1.0 / 1.0. Is this a limitation of the GL_LINE_STRIP style, or is there an easier way to draw semi-circles/arcs? Or did I just completely miss something obvious?

        void Ball::drawBounce()
        {
            float piecesToDraw = 100.0f;
            float arcWidth = 10.0f;
            float arcAngle = 4.0f;

            glBegin(GL_LINE_STRIP);
            for (float i = 0.0f; i < piecesToDraw; i += 1.0f) // positive half
            {
                float currentX = (i / piecesToDraw) * arcWidth;
                glVertex2f(currentX, sqrtf((-currentX * arcAngle) + arcWidth));
            }
            for (float j = piecesToDraw; j > 0.0f; j -= 1.0f) // negative half (go backwards in X direction now)
            {
                float currentX = (j / piecesToDraw) * arcWidth;
                glVertex2f(currentX, -sqrtf((-currentX * arcAngle) + arcWidth));
            }
            glEnd();
        }

    Thanks in advance.
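
    A minimal sketch of the usual workaround (not the asker's code): parameterize by angle instead of x, so the vertex spacing stays even where the curve turns vertical:

        // hypothetical arc: center (cx, cy), radius r, angles in radians
        void drawArc(float cx, float cy, float r,
                     float startAngle, float endAngle, int segments)
        {
            glBegin(GL_LINE_STRIP);
            for (int i = 0; i <= segments; ++i)
            {
                float t = startAngle + (endAngle - startAngle) * (float)i / segments;
                glVertex2f(cx + r * cosf(t), cy + r * sinf(t)); // even steps in angle, not x
            }
            glEnd();
        }
        // drawArc(0, 0, 10, 0, 3.14159f, 32) draws a semi-circle of radius 10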

    Read the article

  • Should I use OpenGL for chess with animations?

    - by fhucho
    At the moment I am experimenting with SurfaceView for my chess game with animations, and I am getting only about 8 FPS in the emulator. I draw a chessboard and 32 chess pieces and rotate everything (to see how smooth it is), with antialiasing enabled. On the Droid I'm getting about 20 FPS, so it's not very smooth. Is it possible to implement a game with very scarce and simple animations without having to use OpenGL? This is what I do every frame:

        // scale and rotate
        matrix.setScale(scale, scale);
        rotation += 3;
        matrix.postRotate(rotation, 152, 152);

        canvas = surfaceHolder.lockCanvas();
        canvas.setDrawFilter(new PaintFlagsDrawFilter(0, Paint.FILTER_BITMAP_FLAG));
        canvas.setMatrix(matrix);
        canvas.drawARGB(255, 255, 255, 255); // fill the canvas with white
        for (int i = 0; i < sprites.size(); i++) {
            sprites.get(i).draw(canvas); // draws chessboard and chess pieces
        }

    Read the article

  • SDL_image/C++ OpenGL Program: IMG_Load() produces fuzzy images

    - by Kami
    I'm trying to load an image file and use it as a texture for a cube, using SDL_image to do that. I used this image because I've found it in various file formats (tga, tif, jpg, png, bmp). The code:

        SDL_Surface *texture;

        // load an image into an SDL surface (i.e. a buffer)
        texture = IMG_Load("/Users/Foo/Code/xcode/test/lena.bmp");
        if (texture == NULL) {
            printf("bad image\n");
            exit(1);
        }

        // create an OpenGL texture object
        glGenTextures(1, &textureObjOpenGLlogo);
        // select the texture object you need
        glBindTexture(GL_TEXTURE_2D, textureObjOpenGLlogo);
        // define the parameters of that texture object
        // how the texture should wrap in the s direction
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        // how the texture should wrap in the t direction
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
        // how the texture lookup should be interpolated when the face is smaller than the texture
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        // how the texture lookup should be interpolated when the face is bigger than the texture
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        // send the texture image to the graphics card
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->w, texture->h, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, texture->pixels);
        // clean up the SDL surface
        SDL_FreeSurface(texture);

    The code compiles without errors or warnings! I've tried all the file formats, but this always produces that ugly fuzzy result. I'm using SDL_image 1.2.9 and SDL 1.2.14 with Xcode 3.2 under 10.6.2. Does anyone know how to fix this?
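
    A hedged guess at the cause (an assumption; the symptom matches a row-layout mismatch): glTexImage2D defaults to 4-byte row alignment, and the call above assumes tightly packed GL_RGB, while the surface IMG_Load returns may have padded rows or BGR channel order. A sketch that uploads according to the surface's actual layout:

        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   // SDL rows are not 4-byte aligned in general
        GLenum format = (texture->format->BytesPerPixel == 4) ? GL_RGBA : GL_RGB;
        // note: BMPs often decode as BGR; GL_BGR / GL_BGRA may be needed instead
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->w, texture->h, 0,
                     format, GL_UNSIGNED_BYTE, texture->pixels);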

    Read the article

  • OpenGL triangle instead of square

    - by Dave
    I'm trying to create a spinning square inside of Xcode using OpenGL, but instead, for some reason, I have a spinning triangle. I'm doing this inside of SIO2, but I don't think that is the problem. Here is the triangle: http://img220.imageshack.us/img220/7051/snapzproxscreensnapz001.png Here is my code:

        void templateRender(void)
        {
            const GLfloat squareVertices[] = {
                 100.0f, -100.0f,
                 100.0f, -100.0f,
                -100.0f,  100.0f,
                 100.0f,  100.0f,
            };

            const unsigned char squareColors[] = {
                255, 255,   0, 255,
                  0, 255, 255, 255,
                  0,   0,   0,   0,
                255,   0, 255, 255,
            };

            glMatrixMode(GL_MODELVIEW);
            glLoadIdentity();
            glClear(GL_DEPTH_BUFFER_BIT | GL_COLOR_BUFFER_BIT);

            // Your rendering code here...
            sio2WindowEnter2D(sio2->_SIO2window, 0.0f, 1.0f);
            {
                glVertexPointer(2, GL_FLOAT, 0, squareVertices);
                glEnableClientState(GL_VERTEX_ARRAY);

                // set up the color array
                glColorPointer(4, GL_UNSIGNED_BYTE, 0, squareColors);
                glEnableClientState(GL_COLOR_ARRAY);

                glTranslatef(sio2->_SIO2window->scl->x * 0.5f,
                             sio2->_SIO2window->scl->y * 0.5f,
                             0.0f);

                static float rotz = 0.0f;
                glRotatef(rotz, 0.0f, 0.0f, 1.0f);
                rotz += 90.0f * sio2->_SIO2window->d_time;

                glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
            }
            sio2WindowLeave2D();
        }
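
    A hedged observation on the vertex array above: the first two vertices are both (100, -100), so the strip contains one degenerate triangle and only one visible one. A square drawn as a GL_TRIANGLE_STRIP needs four distinct corners, for example:

        // corrected strip order (an assumption about the intended square)
        const GLfloat squareVertices[] = {
            -100.0f, -100.0f,   // bottom-left
             100.0f, -100.0f,   // bottom-right
            -100.0f,  100.0f,   // top-left
             100.0f,  100.0f,   // top-right
        };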

    Read the article

  • OpenGL bitmap text fails after drawing polygon

    - by kaykun
    I'm using Win32 and OpenGL to draw text onto a window, using the bitmap font method with wglUseFontBitmaps. Here is my main rendering function:

        glClear(GL_COLOR_BUFFER_BIT);

        glPushMatrix();
        glColor3f(1.0f, 0.0f, 1.0f);
        glBegin(GL_QUADS);
        glVertex2f(0.0f, 0.0f);
        glVertex2f(128.0f, 0.0f);
        glVertex2f(128.0f, 128.0f);
        glVertex2f(0.0f, 128.0f);
        glEnd();
        glPopMatrix();

        glPushMatrix();
        glColor3f(1.0f, 1.0f, 1.0f);
        glRasterPos2i(200, 200);
        glListBase(fontList);
        glCallLists(5, GL_UNSIGNED_BYTE, "Test.");
        glPopMatrix();

        SwapBuffers(hDC);

    As you can see, it's very simple, and the only thing it's supposed to do is draw a quadrilateral and the text "Test.". But the problem is that drawing a polygon seems to mess up any text operations I try to do after it. If I place the text drawing functions before the polygon, both the text and the polygon draw fine. Is there something I'm missing here? Edit: This problem only happens when the window is run fullscreen via ChangeDisplaySettings. Any reason why this would be?
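
    One hedged check (a sketch, not a confirmed diagnosis): wglUseFontBitmaps text is drawn at the current raster position, and OpenGL silently skips all bitmap drawing while that position is invalid, for example if it landed outside the view volume after the fullscreen mode switch changed the window size. The state can be queried directly:

        // if this prints 0, glRasterPos2i(200, 200) fell outside the view
        // volume and every subsequent glCallLists() bitmap is dropped
        GLboolean valid = GL_FALSE;
        glGetBooleanv(GL_CURRENT_RASTER_POSITION_VALID, &valid);
        printf("raster position valid: %d\n", valid);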

    Read the article

  • OpenGL texture mapping blocking colours on FreeType?

    - by Dororo
    I'm using FreeType to allow fonts to be used in OpenGL. However, I'm having a problem where I cannot change the font colour whenever I do texture mapping: no matter what I select with glColor3f, the text will just come out white. The texture works fine.

        glClear(GL_COLOR_BUFFER_BIT);
        glLoadIdentity();
        glColor3f(0.5, 0.0, 0.5);

        glPushMatrix();
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        glEnable(GL_TEXTURE_2D);
        glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
        glBindTexture(GL_TEXTURE_2D, texName);
        glBegin(GL_POLYGON);
        glTexCoord2f(0, 1); glVertex2f(-16, -16);
        glTexCoord2f(0, 0); glVertex2f(-16, 16);
        glTexCoord2f(1, 0); glVertex2f(16, 16);
        glTexCoord2f(1, 1); glVertex2f(16, -16);
        glEnd();
        glDisable(GL_TEXTURE_2D);
        glDisable(GL_BLEND);
        glPopMatrix();

        glColor3f(1, 0, 0);
        print(our_font, -300 + screenWidth / 2.0, screenHeight / 2.0, "fifty two - %7.2f", spin);

    This is the problem code. I can confirm that drawing a polygon beneath this code will indeed make it red, but the text is not changing to red as it should. If I remove the texture mapping above, it turns red again. I can only think it is a problem with enabling and disabling, and I've forgotten to do something...?
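
    A hedged reading of the code above: glTexEnvf(..., GL_REPLACE) is never reset, and GL_REPLACE makes texels overwrite the current colour entirely, so any textured text drawn afterwards ignores glColor3f. If the print() routine renders textured glyphs, restoring the default mode first is one sketch of a fix:

        // GL_MODULATE (the default) multiplies texels by the current colour,
        // so white glyph pixels take on the glColor3f tint
        glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
        glColor3f(1, 0, 0);
        print(our_font, -300 + screenWidth / 2.0, screenHeight / 2.0, "fifty two - %7.2f", spin);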

    Read the article

  • Qt 5.3 OpenGL - vertex buffer object drawing using the core profile

    - by user3700881
    I'm using Qt 5.3 to create a QWindow to do some basic rendering. The QWindow is declared like this:

        class OpenGLWindow : public QWindow, protected QOpenGLFunctions_3_3_Core
        {
            Q_OBJECT
            ...
        }

    It is initialized in the constructor:

        OpenGLWindow::OpenGLWindow(QWindow *parent)
            : QWindow(parent)
        {
            QSurfaceFormat format;
            format.setVersion(3, 3);
            format.setProfile(QSurfaceFormat::CoreProfile);

            this->setSurfaceType(OpenGLSurface);
            this->setFormat(format);
            this->create();

            _context = new QOpenGLContext;
            _context->setFormat(format);
            _context->create();
            _context->makeCurrent(this);

            this->initializeOpenGLFunctions();
            ...
        }

    And this is the rendering code:

        void OpenGLWindow::render()
        {
            if (!isExposed())
                return;

            _context->makeCurrent(this);
            glClear(GL_COLOR_BUFFER_BIT);

            glUseProgram(_shaderProgram);
            glBindBuffer(GL_ARRAY_BUFFER, _positionBufferObject);
            glEnableVertexAttribArray(0);
            glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 0, 0);
            glDrawArrays(GL_TRIANGLES, 0, 3);
            glDisableVertexAttribArray(0);
            glUseProgram(0);

            _context->swapBuffers(this);
        }

    I am trying to draw a simple triangle using a vertex and fragment shader. The problem is that the triangle does not show up when the core profile is set; it only shows up when I set the OpenGL version to 2.0 or use the compatibility profile. From my point of view that doesn't make any sense, because I am not using fixed functionality at all. What am I missing?
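
    A hedged guess at the missing piece (an assumption from the code shown): the 3.3 core profile requires a vertex array object to be bound before glVertexAttribPointer and glDrawArrays will work, while the compatibility profile (and GL 2.0) supplies an implicit default VAO, which would explain exactly this behavior. A minimal sketch for the constructor, with vao as a hypothetical member:

        // core profile: attribute state lives in a VAO, and none is bound by default
        GLuint vao;
        glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);   // keep bound, or rebind in render() before drawing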

    Read the article

  • OpenGL depth buffer on Android

    - by kayahr
    I'm currently learning OpenGL ES programming on Android (2.1). I started with the obligatory rotating cube. It's rotating fine, but I can't get the depth buffer to work: the polygons are always displayed in the order the GL commands render them. I do this during initialization of GL:

        gl.glClearColor(.5f, .5f, .5f, 1);
        gl.glShadeModel(GL10.GL_SMOOTH);
        gl.glClearDepthf(1f);
        gl.glEnable(GL10.GL_DEPTH_TEST);
        gl.glDepthFunc(GL10.GL_LEQUAL);
        gl.glHint(GL10.GL_PERSPECTIVE_CORRECTION_HINT, GL10.GL_NICEST);

    On surface change I do this:

        gl.glViewport(0, 0, width, height);
        gl.glMatrixMode(GL10.GL_PROJECTION);
        gl.glLoadIdentity();
        GLU.gluPerspective(gl, 45.0f, (float) width / (float) height, 0.1f, 100f);

    When I enable backface culling, everything looks correct. But backface culling is only a speed optimization, so it should also work with only the depth buffer, or not? So what is missing here?
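
    Two hedged checks (assumptions, since the per-frame rendering code isn't shown; written with the C bindings, where gl.glClear(...) is the GL10 equivalent):

        // 1) clear the depth buffer along with the color buffer every frame;
        //    clearing only GL_COLOR_BUFFER_BIT leaves stale depth values
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        // 2) make sure the EGL surface actually has depth bits; with GLSurfaceView
        //    this is requested before setRenderer(), e.g. for a 16-bit depth buffer:
        //    setEGLConfigChooser(8, 8, 8, 8, 16, 0)   (Java)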

    Read the article

  • OpenGL color quadrangle

    - by Tyzak
    Hello, I'm trying out OpenGL. I have a program that creates a black border and a white corner (quadrangle). Now I want to give the corners of the quadrangle different colors. I don't know where exactly to write the code, and I don't know much about glColor4f; I searched on Google but didn't get it. (Is there a good description somewhere?)

        #include <iostream>
        #include <GL/freeglut.h>

        void Init()
        {
            glColor4f(100, 0, 0, 0);
        }

        void RenderScene() // drawing function
        {
            glLoadIdentity();
            glBegin(GL_POLYGON);
            glVertex3f(-0.5, -0.5, -0.5);
            glVertex3f( 0.5, -0.5, -0.5);
            glVertex3f( 0.5,  0.5, -0.5);
            glVertex3f(-0.5,  0.5, -0.5);
            glEnd();
            glFlush();
        }

        void Reshape(int width, int height)
        {
        }

        void Animate(int value)
        {
            std::cout << "value=" << value << std::endl;
            glutPostRedisplay();
            glutTimerFunc(100, Animate, ++value);
        }

        int main(int argc, char **argv)
        {
            glutInit(&argc, argv);                  // initialize GLUT
            glutInitDisplayMode(GLUT_RGB);          // window configuration
            glutInitWindowSize(600, 600);
            glutCreateWindow("inkrement screen; visual screen"); // create the window
            glutDisplayFunc(RenderScene);           // register the drawing function
            glutReshapeFunc(Reshape);
            glutTimerFunc(10, Animate, 0);
            Init();
            glutMainLoop();
            return 0;
        }
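
    A minimal sketch of per-vertex colouring (an assumption about the intent): in the fixed-function pipeline, glColor* sets the current colour for every vertex that follows, and each component is a float clamped to the 0.0-1.0 range (so the 100 in glColor4f above just saturates to 1.0). Calling it before each glVertex3f gives every corner its own colour:

        glBegin(GL_POLYGON);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, -0.5f); // red corner
        glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, -0.5f); // green corner
        glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.5f,  0.5f, -0.5f); // blue corner
        glColor3f(1.0f, 1.0f, 1.0f); glVertex3f(-0.5f,  0.5f, -0.5f); // white corner
        glEnd();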

    Read the article

  • How to setup OpenGL camera for a racing game

    - by vian
    I need the view to show the road polygon (a rectangle of 3.0f x 100.0f) with the road's vanishing point at 3/4 of the viewport height and the nearest road edge at the viewport's bottom side. See the Crazy Taxi game for an example of what I wish to do. I'm using the iPhone SDK 3.1.2 default OpenGL ES project template. I set up the projection matrix as follows:

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glFrustumf(-2.25f, 2.25f, -1.5f, 1.5f, 0.1f, 1000.0f);

    Then I use glRotatef to adjust for landscape mode and set up the camera:

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glRotatef(-90, 0.0f, 0.0f, 1.0f);
        const float cameraAngle = 45.0f * M_PI / 180.0f;
        gluLookAt(0.0f, 2.0f, 0.0f,
                  0.0f, 0.0f, 100.0f,
                  0.0f, cos(cameraAngle), sin(cameraAngle));

    My road polygon triangle strip is like this:

        static const GLfloat roadVertices[] = {
            -1.5f, 0.0f,   0.0f,
             1.5f, 0.0f,   0.0f,
            -1.5f, 0.0f, 100.0f,
             1.5f, 0.0f, 100.0f,
        };

    And I can't seem to find the right parameters for gluLookAt; my vanishing point is always at the center of the screen.
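
    One hedged approach (a sketch of the principle, not tuned values): the vanishing point of lines running parallel to the view direction sits at the projection center, so an asymmetric frustum moves it on screen without touching gluLookAt. Choosing bottom/top so that 0 sits at 3/4 of their range:

        // (0 - bottom) / (top - bottom) = 2.25 / 3.0 = 0.75 of that axis
        glFrustumf(-2.25f, 2.25f, -2.25f, 0.75f, 0.1f, 1000.0f);

    Since the landscape orientation here comes from rotating the modelview, the offset may belong on the left/right pair instead, whichever frustum axis ends up vertical on screen.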

    Read the article

  • Orthogonal projection and texture coordinates in OpenGL

    - by knuck
    I'm writing a 2D game in OpenGL. I already set up an orthogonal projection, so I can easily know where a quad will end up on screen. The problem is, I also want to be able to map pixels directly to texture coords, so I applied an orthogonal transformation (using gluOrtho2D) to the texture as well. Now I can map pixels directly using integers and glTexCoord2i. The thing is, after googling/reading/asking, I found out no one really knows (apparently) the behavior of glTexCoord2i, but it works just fine the way I'm using it. Some sample test code I wrote follows:

        glBegin(GL_QUADS);
        glTexCoord2i(16, 0);  glVertex2f(X, Y);
        glTexCoord2i(16, 16); glVertex2f(X, Y + 32);
        glTexCoord2i(32, 16); glVertex2f(X + 32, Y + 32);
        glTexCoord2i(32, 0);  glVertex2f(X + 32, Y);
        glEnd();

    So, is there any problem with what I'm doing, or is what I'm doing correct?
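
    For reference, a hedged equivalent without gluOrtho2D (texWidth and texHeight are hypothetical dimensions of the bound texture): glTexCoord2i simply feeds integer coordinates through the texture matrix, so a plain scale maps pixel units onto the usual [0, 1] range:

        glMatrixMode(GL_TEXTURE);
        glLoadIdentity();
        glScalef(1.0f / texWidth, 1.0f / texHeight, 1.0f);
        glMatrixMode(GL_MODELVIEW);
        // now glTexCoord2i(16, 0) addresses texel (16, 0) directly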

    Read the article

  • OpenGL equivalent of GDI's HatchBrush or PatternBrush?

    - by Ptah- Opener of the Mouth
    I have a VB6 application (please don't laugh) which does a lot of drawing via BitBlt and the standard VB6 drawing functions. I am running up against performance issues (yes, I do the regular tricks like drawing to memory). So I decided to investigate other ways of drawing, and have come upon OpenGL. I've been doing some experimenting, and it seems straightforward to do most of what I want; the application mostly uses very simple drawing, relatively large 2D rectangles of solid colors and such, but I haven't been able to find an equivalent to something like a HatchBrush or PatternBrush.

    More specifically, I want to be able to specify a small monochrome pixel pattern, choose a color, and whenever I draw a polygon (or whatever), instead of it being solid, have it automatically tiled with that pattern, not translated or rotated or skewed or stretched, with the "on" bits of the pattern showing up in the specified color and the "off" bits leaving whatever had been drawn underneath untouched.

    Obviously I could do all the calculations myself; that is, instead of drawing a polygon which is somehow automatically tiled for me, I could calculate all of the lines or pixels that actually need to be drawn and then draw them directly. But is there an easier way, like in GDI, where you just say "draw this polygon using this brush"? I am guessing that textures might be able to accomplish what I want, but it's not clear to me (I'm totally new to this and the documentation I've found is not entirely obvious); it seems like textures might skew or translate or stretch the pattern based upon the vertices of the polygon, whereas I want the pattern tiled. Is there a way to do this, or something like it, other than brute-force calculation of exactly the pixels/lines that need to be drawn? Thanks in advance for any help.
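
    The closest fixed-function analogue may be polygon stippling (a hedged sketch; legacy OpenGL only): glPolygonStipple takes a 32x32 monochrome bit pattern (128 bytes) that is anchored to window coordinates, so it is never translated, rotated, skewed, or stretched with the polygon, and "off" bits leave the pixels underneath untouched, much like a GDI pattern brush:

        GLubyte pattern[128];                     // 32x32 bits; fill with the hatch of your choice
        memset(pattern, 0xAA, sizeof(pattern));   // e.g. alternating vertical stripes
        glEnable(GL_POLYGON_STIPPLE);
        glPolygonStipple(pattern);
        glColor3f(0.0f, 0.5f, 1.0f);              // "on" bits draw in this color
        glRectf(x0, y0, x1, y1);                  // "off" bits show the existing background
        glDisable(GL_POLYGON_STIPPLE);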

    Read the article

  • OpenGL pixels drawn with each horizontal pair swapped

    - by Tim Kane
    I'm somewhat new to OpenGL, though I'm fairly sure my problem lies in the pixel format being used or in how my texture is being generated. I'm drawing a texture onto a flat 2D quad using a 16-bit RGB5_A1 pixel format, though I don't make use of any alpha at this stage. The problem I'm having is that each horizontal pair of pixel values gets swapped. That is, if the pixel positions should be in this order (assume an 8x2 image):

        0 1 2 3 4 5 6 7

    they are instead drawn as:

        1 0 3 2 5 4 7 6

    (Or, more clearly from the image: left is what I get, right is what I should get.) The question is: how have I ended up with this? Is there something wrong with the pixel format? Unlikely, since the colours all appear correct, and I would expect all kinds of nastiness if it were down to endianness. Suggestions greatly appreciated. Update: Turns out the problem was in my source renderer. Interestingly, I've avoided the problem entirely by using 32-bit textures (I haven't tried 24-bit at this point).

    Read the article
