Search Results

Search found 3875 results on 155 pages for 'opengl es lighting'.

Page 39/155 | < Previous Page | 35 36 37 38 39 40 41 42 43 44 45 46  | Next Page >

  • Android dev platform supporting OpenGL ES 2.0: Where to buy?

    - by pixelpush
    I plan to port some camera and multimedia algorithms and functionality to a Qualcomm Snapdragon platform running Android. I need OpenGL ES 2.0 acceleration for many of the algorithms. Which platform is the right one, and where can I purchase it? The Android dev platform on Google's website only supports OpenGL ES 1.x. Thanks for any input.

    Read the article

  • Does use of simple shaders improve performance/battery life?

    - by Miro
    I'm making an OpenGL game for Android. So far I've used only the fixed-function pipeline, and I'm rendering simple things. The fixed-function pipeline includes a lot of stuff I don't need, so I'm thinking about implementing shaders in my game to simplify the OpenGL pipeline if that gives better performance. Better performance means better battery life, unless the frame rate is capped by a software limit rather than by hardware power.
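
    For scale, this is roughly the simplest possible programmable replacement for the fixed-function path in GLES 2.0: one flat colour, no lighting, no texturing. A sketch only; the attribute/uniform names are illustrative, and the sources are shown as the C string literals they are usually compiled from.

      static const char *kVertexSrc =
          "attribute vec4 a_position;            \n"
          "uniform mat4 u_mvp;                   \n"
          "void main() {                         \n"
          "    gl_Position = u_mvp * a_position; \n"
          "}                                     \n";

      static const char *kFragmentSrc =
          "precision mediump float;              \n"
          "uniform vec4 u_color;                 \n"
          "void main() {                         \n"
          "    gl_FragColor = u_color;           \n"
          "}                                     \n";

      // Compile with glCreateShader/glShaderSource/glCompileShader, link both
      // into one program with glLinkProgram, and the GPU then runs only these
      // few instructions per vertex/fragment instead of the full fixed-function
      // state machine.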

    Read the article

  • Loading and drawing materials using Lib3ds

    - by Dfowj
    Hey all, I'm currently using Lib3ds to load models into my C++/OpenGL project. So far, I've been following the model loading tutorial found here. The tutorial gives a good example of how to draw the vertices and normals using VBOs, but so far I've been lost as to how to do the same thing with materials. Could I get an explanation/example of how to both load and draw the materials of my meshes using Lib3ds and OpenGL?
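
    A minimal fixed-function sketch of the "draw" half, assuming a material already loaded by lib3ds. In the lib3ds 1.x headers the Lib3dsMaterial struct exposes ambient/diffuse/specular colour arrays and a shininess value, but check your version; the field names here are from memory and may differ.

      // Apply one material, then draw the VBO range of faces that uses it.
      void applyMaterial(const Lib3dsMaterial *mat)
      {
          glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT,  mat->ambient);
          glMaterialfv(GL_FRONT_AND_BACK, GL_DIFFUSE,  mat->diffuse);
          glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, mat->specular);
          // lib3ds stores shininess as 0..1; GL wants 0..128 (an assumption to verify)
          glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, mat->shininess * 128.0f);
      }

    The loading side is just walking the file's material list after lib3ds has parsed it and keeping a name-to-material lookup, so that each group of faces can find and apply its material before its draw call.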

    Read the article

  • How to use two textures, one as the background and another on a rotating cube?

    - by VampirEMufasa
    I am working in OpenGL ES 2.0 and writing a demo for my project. I load two PNG images as textures with SOIL, but now I need to use one of them as the texture for my demo's background and the other as the texture for a rotating cube. In OpenGL ES 2.0 texturing happens in the shader, and I don't know how to apply the different textures to the different objects. Who can help me? Thank you very much!
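
    The common pattern is to keep one sampler uniform in the shader and simply rebind the texture on unit 0 between the two draws. A sketch in GLES 2.0 C calls; the uniform name and the two draw helpers are illustrative, not from the original demo.

      GLint samplerLoc = glGetUniformLocation(program, "u_texture");
      glUseProgram(program);
      glUniform1i(samplerLoc, 0);              // sampler reads from texture unit 0

      glActiveTexture(GL_TEXTURE0);
      glBindTexture(GL_TEXTURE_2D, bgTex);     // background texture on unit 0
      drawFullscreenQuad();                    // hypothetical helper

      glBindTexture(GL_TEXTURE_2D, cubeTex);   // swap in the cube's texture
      drawRotatingCube();                      // hypothetical helper

    Only if both textures must be sampled in the same draw call do you need two sampler uniforms bound to two different texture units.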

    Read the article

  • Should I use SDL_Surface or SDL_Window? [on hold]

    - by The Light Spark
    I am basically making an OpenGL game, and I have just started out in this territory. I have seen tutorials that render to an SDL_Surface, while other tutorials use an SDL_Window, obtain an OpenGL context from it, and render to that, with no mention of surfaces. I understand the differences between the two, but is there any advantage to using one over the other? Can I use the SDL_Window technique to create high-quality games, or does the SDL_Surface approach get me better results?
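
    For reference, the SDL_Window route for OpenGL boils down to this (a minimal SDL2 sketch with error handling trimmed); the GL context renders straight into the window, and no SDL_Surface is involved.

      #include <SDL.h>
      #include <SDL_opengl.h>

      int main(int argc, char *argv[])
      {
          SDL_Init(SDL_INIT_VIDEO);
          SDL_Window *win = SDL_CreateWindow("Game",
              SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
              800, 600, SDL_WINDOW_OPENGL);
          SDL_GLContext ctx = SDL_GL_CreateContext(win);

          bool running = true;
          while (running) {
              SDL_Event e;
              while (SDL_PollEvent(&e))
                  if (e.type == SDL_QUIT) running = false;

              glClear(GL_COLOR_BUFFER_BIT);   // ...draw the scene here...
              SDL_GL_SwapWindow(win);         // present the frame
          }

          SDL_GL_DeleteContext(ctx);
          SDL_DestroyWindow(win);
          SDL_Quit();
          return 0;
      }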

    Read the article

  • Setting Krypton Light to Screen Pixels

    - by Adam Jerrett
    So a few days back, I started playing around with Krypton XNA for 2D lighting in my game. I noticed, in general, that spawning a light at (0,0) with Krypton causes the light to appear pretty much in the centre of the game screen. Is there any way to change this so a Krypton light's "starting point" at [0,0] would spawn at the top left of the screen, and thus follow the standard screen co-ordinates for position? I ask because currently I'm busy working on my game where my spawn point is [512,512]. With hard-coded values, the closest I've got to the light being "central" to this point is the vector position [12,-20], which makes no sense and is impossible to craft, mathematically, if I want the light to move with the camera (the position [480,512] maps roughly to [10,-20]). So, is there any way to "normalise" the Krypton lights to use standard screen co-ordinates? If you guys can, play around with the demo from the site and please see if you can find anything out about it. Documentation on the engine is rather scarce, so it's difficult to find anything relevant to my "pixel-perfect" need. It might just also be something in the code with regards to the matrices that I'm not fully understanding. Any help would be useful. Thanks.
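
    A speculative sketch of the mapping the symptoms suggest (the project is XNA/C#, but the arithmetic is language-independent): if the lighting engine treats [0,0] as the screen centre with clip-style axes, converting a pixel position to that space looks like the function below. The exact scale Krypton expects, and whether its own view matrix should be applied on top, are assumptions to verify against the demo.

      struct Vec2 { float x, y; };

      // Map window pixels (origin top-left) to centre-origin [-1,1] coordinates.
      Vec2 pixelToCentredClip(Vec2 pixel, float screenW, float screenH)
      {
          Vec2 out;
          out.x =  (pixel.x / screenW) * 2.0f - 1.0f;    // 0..W  ->  -1..+1
          out.y = -((pixel.y / screenH) * 2.0f - 1.0f);  // 0..H  ->  +1..-1 (y flipped)
          return out;
      }

    If that is the space the lights live in, driving them through the same view/projection matrix as the rest of the scene, rather than hand-tuned constants, should keep them locked to the camera.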

    Read the article

  • HLSL: What do you get when you subtract world position from InvertViewProjection.Transform?

    - by cubrman
    In one of NVIDIA's Vertex shaders (the metal one) I found the following code:

      // transform object normals, tangents, & binormals to world-space:
      float4x4 WorldITXf : WorldInverseTranspose < string UIWidget="None"; >;
      // provide tranform from "view" or "eye" coords back to world-space:
      float4x4 ViewIXf : ViewInverse < string UIWidget="None"; >;
      ...
      float4 Po = float4(IN.Position.xyz,1);  // homogeneous location coordinates
      float4 Pw = mul(Po,WorldXf);            // convert to "world" space
      OUT.WorldView = normalize(ViewIXf[3].xyz - Pw.xyz);

    The term OUT.WorldView is subsequently used in a Pixel Shader to compute lighting:

      float3 Ln = normalize(IN.LightVec.xyz);
      float3 Nn = normalize(IN.WorldNormal);
      float3 Vn = normalize(IN.WorldView);
      float3 Hn = normalize(Vn + Ln);
      float4 litV = lit(dot(Ln,Nn),dot(Hn,Nn),SpecExpon);
      DiffuseContrib = litV.y * Kd * LightColor + AmbiColor;
      SpecularContrib = litV.z * LightColor;

    Can anyone tell me what exactly is WorldView here? And why do they add it to the normal?
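
    What the shader is doing, written out with GLM for clarity (a sketch, not NVIDIA's code): ViewIXf[3].xyz is the camera's world-space position, so WorldView is the normalized direction from the surface point to the camera, the Blinn-Phong view vector V. In the pixel shader it is added to the light direction L (not the normal) to build the half vector H used for the specular term.

      #include <glm/glm.hpp>

      glm::vec3 halfVector(const glm::vec3 &cameraPos,
                           const glm::vec3 &worldPos,
                           const glm::vec3 &lightDir)
      {
          glm::vec3 V = glm::normalize(cameraPos - worldPos);  // == OUT.WorldView
          glm::vec3 L = glm::normalize(lightDir);              // == Ln
          return glm::normalize(V + L);                        // == Hn in the shader
      }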

    Read the article

  • Oracle BI administration and documentation

    - by Fekete Zoltán
    The question has come up: how do you install the administration tools of the Oracle Business Intelligence packages (BI EE, BI SE One)? The BI end-user interface itself is web based; using a browser we can work with the integrated elements:
    - interactive dashboards
    - ad-hoc analyses
    - reports and statements
    - alerts and notifications
    - guided analytics and workflows, ...
    Some of the administrator tools have to be installed as clients on Windows client machines, i.e. they are available in the Windows variant of the BI EE installation kit. The Oracle BI documentation, including the administration document, can be read and downloaded here.

    Read the article

  • Separate announcement on E-Business Suite 11.5.10 Sustaining Support and 12.1 Extended Support

    - by user552636
    True, this year's Oracle Open World (OOW) was a long time ago, but this blog did not exist back then. Since then, however, several people have asked me how to interpret the announcement made at OOW about E-Business Suite support, so I thought it might be useful for Hungarian users if I wrote a few lines about it.

    The announcement concerning E-Business Suite (EBS) 11.5.10: Under Oracle's Lifetime Support model this release became generally available in November 2004; Oracle provided Premier Support for it until November 30, 2010, and provides Extended Support from December 1, 2010 until the end of November 2013. From December 1 of next year, EBS 11.5.10 moves into the Sustaining Support phase, in which Development no longer fixes newly discovered bugs. According to the announcement, Oracle is making an exception for 11.5.10: in the period from December 1, 2013 to November 30, 2014 it will still provide fixes for new bugs in the case of Severity 1 issues affecting production. One thing to watch: the system must be at the minimum patch level detailed in My Oracle Support document Doc ID 883202.1. This extra service does not affect the support fee.

    The announcement concerning E-Business Suite (EBS) 12.1: The originally announced Extended Support period for EBS 12.1 was May 1, 2014 to April 30, 2017. Oracle has extended this period by 19 months, so Extended Support for this release will run until December 31, 2018. Further good news for customers using this release is that Oracle waives the Extended Support uplift. Under standard Oracle pricing, the service fee in the first year of Extended Support is 110% of the Premier Support fee, and in the second and third years 120% of it; in this case Oracle waives the extra 10% and 20% respectively, so it will provide Extended Support for the Premier Support fee. One thing to watch: the system must be at the minimum patch level detailed in My Oracle Support document Doc ID 1195034.1.

    To make the periods easier to follow, I have illustrated the individual support phases of the releases in question in the chart below.

    Read the article

  • How can OpenGL graphics be displayed remotely using VNC?

    - by Jared Brown
    I am attempting, unsuccessfully, to run a program that uses OpenGL to render a model in a viewport over VNC. The error message I receive is: Xlib: extension "GLX" missing on display ":1.0". It was my understanding that VNC can be configured to render all graphics remotely and send a compressed screen grab from the display buffer to the local client, which would seem to negate the need for GLX extensions on the local client. Can VNC be configured this way, and could you briefly describe how? Remote host: vncserver on RHEL 5. Local client: UltraVNC on Windows XP.

    Read the article

  • The ship "shudders" in scrolling Asteroids

    - by Ciaran
    In my Asteroids game the user can scroll through space. When scrolling, the ship is drawn in the centre of the window. I use interpolation. I scroll the window using glOrtho, centering it around the centre of the ship. On my first machine (7 years old, Windows XP, NVIDIA), I am doing 50 updates and 76 frames per second. This is smooth. My other machine, an old Compaq laptop (Pentium III) with Linux and the Radeon OpenGL driver, delivers 50 updates and 30 frames per second. The ship regularly seems to "shudder" back and forth when at maximum thrust. When you position the mouse cursor beside the ship it is obvious that its relative position in the window changes. Also, the stars seem blurred into short "lines". Playing the game in non-scrolling mode, the ship moves within the window, glOrtho is therefore not called repeatedly, and there is no problem. I suspect a bug in my positioning of the ship and the window, but I have dumped out these values and they seem to only go forward, not forward-back-forward. The driver does support double buffering. I guess if it is my bug I need to slow the frame rate down to debug properly. My question: is this an obvious driver bug, or is the slower machine uncovering a bug in my code? If so, some debugging tips would be appreciated. I am drawing in world co-ordinates and letting OpenGL do the scaling and translation, so if I had a quick way of verifying what pixel co-ordinates OpenGL produces for the ship centre, that would help clarify this.
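
    For that last point, gluProject reports exactly which window pixel the current matrices map a world-space point to. A sketch (call it right after the frame's modelview/projection are set up; the ship-centre arguments are whatever world coordinates you already track):

      #include <GL/glu.h>
      #include <cstdio>

      void printShipWindowPos(double shipX, double shipY)
      {
          GLdouble model[16], proj[16];
          GLint viewport[4];
          glGetDoublev(GL_MODELVIEW_MATRIX, model);
          glGetDoublev(GL_PROJECTION_MATRIX, proj);
          glGetIntegerv(GL_VIEWPORT, viewport);

          GLdouble winX, winY, winZ;
          gluProject(shipX, shipY, 0.0, model, proj, viewport, &winX, &winY, &winZ);
          std::printf("ship centre -> pixel (%.2f, %.2f)\n", winX, winY);
      }

    If that value stays put while the ship visibly jumps, the shudder is in presentation (timing or driver); if it wobbles, the bug is in the ortho/ship bookkeeping.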

    Read the article

  • X Error of failed request: BadMatch [migrated]

    - by Andrew Grabko
    I'm trying to execute some "hello world" OpenGL code:

      #include <GL/freeglut.h>

      void displayCall() {
          glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
          glEnable(GL_DEPTH_TEST);
          // ... some more code here ...
          glutSwapBuffers();
      }

      int main(int argc, char *argv[]) {
          glutInit(&argc, argv);
          glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
          glutInitWindowSize(500, 500);
          glutInitWindowPosition(300, 200);
          glutInitContextVersion(4, 2);
          glutInitContextFlags(GLUT_FORWARD_COMPATIBLE);
          glutCreateWindow("Hello World!");
          glutDisplayFunc(displayCall);
          glutMainLoop();
          return 0;
      }

    As a result I get:

      X Error of failed request: BadMatch (invalid parameter attributes)
      Major opcode of failed request: 128 (GLX)
      Minor opcode of failed request: 34 ()
      Serial number of failed request: 39
      Current serial number in output stream: 40

    Here is the stack trace:

      fghCreateNewContext() at freeglut_window.c:737 0x7ffff7bbaa81
      fgOpenWindow() at freeglut_window.c:878 0x7ffff7bbb2fb
      fgCreateWindow() at freeglut_structure.c:106 0x7ffff7bb9d86
      glutCreateWindow() at freeglut_window.c:1,183 0x7ffff7bbb4f2
      main() at AlphaTest.cpp:51 0x4007df

    Here is the last piece of code, after which the program crashes:

      createContextAttribs = (CreateContextAttribsProc)
          fghGetProcAddress("glXCreateContextAttribsARB");
      if (createContextAttribs == NULL) {
          fgError("glXCreateContextAttribsARB not found");
      }
      context = createContextAttribs(dpy, config, share_list, direct, attributes);

    The "glXCreateContextAttribsARB" address is obtained successfully, but the program crashes on its invocation. If I specify an OpenGL version lower than 4.2 in glutInitContextVersion(), the program runs without errors. Here is my glxinfo's OpenGL version:

      OpenGL version string: 4.2.0 NVIDIA 285.05.09

    I would appreciate any further ideas.
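
    One cheap experiment (a sketch based only on the symptoms above, not a confirmed fix): ask for the 4.2 core profile explicitly instead of the forward-compatible flag, and print what the driver actually hands back. If a plain context already reports 4.2, the BadMatch points at the GLX visual/fbconfig negotiation rather than at the GL version itself.

      #include <GL/freeglut.h>
      #include <cstdio>

      int main(int argc, char *argv[]) {
          glutInit(&argc, argv);
          glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
          glutInitContextVersion(4, 2);
          glutInitContextProfile(GLUT_CORE_PROFILE);  // instead of GLUT_FORWARD_COMPATIBLE
          glutCreateWindow("Context probe");
          std::printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));
          return 0;
      }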

    Read the article

  • Problems when rendering code on Nvidia GPU

    - by 2am
    I am following the OpenGL GLSL cookbook 4.0. I have rendered a tessellated quad, as you see in the screenshot below, and I am moving the Y coordinate of every vertex using a time-based sin function, as given in the code in the book. This program, as you see in the text in the image, runs perfectly on the built-in Intel HD graphics of my processor, but I have Nvidia GT 555M graphics in my laptop (which, by the way, has switchable graphics). When I run the program on the graphics card, the OpenGL shader compilation fails. It fails on the following instruction: pos.y = sin.waveAmp * sin(u); giving the error Error C1105: Cannot call a non-function. I know this error is coming from the sin(u) call which you see in the instruction; I am not able to understand why. When I removed sin(u) from the code, the program ran fine on the Nvidia card. It runs fine with sin(u) on the Intel HD 3000 graphics. Also, if you notice, the program is almost unusable with Intel HD 3000 graphics: I am getting only 9 FPS, which is not enough; it is too much load for the Intel HD 3000. So, is the sin(x) function not defined in the GLSL implementation shipped with the Nvidia drivers, or is it something else?
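
    A guess worth checking rather than a confirmed answer: the expression sin.waveAmp implies a shader variable literally named "sin", and Nvidia's compiler then treats every later sin(u) as an attempted call on that variable (hence "Cannot call a non-function"), while Intel's compiler is more forgiving. Renaming the struct instance sidesteps the clash. Illustrative GLSL, written as the C++ string literal that shader sources usually live in:

      const char *vertexShaderFragment = R"(
          struct WaveParams { float waveAmp; float freq; };
          uniform WaveParams wave;                 // was named "sin" in the failing shader
          // ...
          //   pos.y = wave.waveAmp * sin(u);      // built-in sin() is unambiguous again
      )";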

    Read the article

  • 3D point cloud render from x, y, z 2D arrays with texture

    - by user1733628
    I need some direction on 3D point cloud display using OpenGL in C++ (VS2008). I am brand new to OpenGL and trying to do a 3D point cloud display with a texture. I have three 2D arrays (each the same size, 1024x512) representing the x, y, z of each point. I think I am on the right track with:

      glBegin(GL_POLYGON);
      for (int i = 0; i < 1024; i++) {
          for (int j = 0; j < 512; j++) {
              glVertex3f(x[i][j], y[i][j], z[i][j]);
          }
      }
      glEnd();

    Now this loads all the vertices in the buffer (I think), but from here I am not sure how to proceed. Or I am completely wrong here. Then I have another 2D array (same size) that contains color data (values from 0-255) that I want to use as a texture on the 3D point cloud and display. I understand that this may be a very basic OpenGL implementation for some, but for me this is a huge learning curve. So any pointers, nudge or kick in the right direction will be appreciated.
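
    A minimal immediate-mode sketch of one way forward (assuming the same x/y/z arrays plus a 1024x512 array of 0-255 intensities, here called shade): a point cloud is normally drawn with GL_POINTS rather than GL_POLYGON, and the intensity can be supplied as a per-vertex colour instead of a texture.

      glBegin(GL_POINTS);
      for (int i = 0; i < 1024; i++) {
          for (int j = 0; j < 512; j++) {
              unsigned char c = shade[i][j];          // 0-255 intensity for this point
              glColor3ub(c, c, c);                    // grey level as the point colour
              glVertex3f(x[i][j], y[i][j], z[i][j]);
          }
      }
      glEnd();

    Once that works, the same data can be packed into vertex/colour arrays or VBOs for speed; true texturing would additionally need per-vertex texture coordinates.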

    Read the article

  • Rendering multiple squares fast?

    - by Sam
    So I'm taking my first steps with OpenGL development on Android, and I'm stuck on some serious performance issues... What I'm trying to do is render a whole grid of single-colored squares onto the screen, and I'm getting framerates of ~7 FPS. The squares are 9px in size right now with a one-pixel border in between, so I get a few thousand of them. I have a class "Square", and the Renderer iterates over all Squares every frame and calls the draw() method of each (the iteration alone is fast enough; with no OpenGL code the whole thing runs smoothly at 60 FPS). Right now the draw() method looks like this:

      // Prepare the square coordinate data
      GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
              GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);
      // Set color for drawing the square
      GLES20.glUniform4fv(mColorHandle, 1, color, 0);
      // Draw the square
      GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length,
              GLES20.GL_UNSIGNED_SHORT, drawListBuffer);

    So it's actually only 3 OpenGL calls. Everything else (loading shaders, filling buffers, getting the appropriate handles, etc.) is done in the constructor, and things like the program and the handles are also static attributes. What am I missing here; why is it rendering so slow? I've also tried loading the buffer data into VBOs, but this is actually slower... Maybe I did something wrong, though. Any help greatly appreciated! :)
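
    The usual cure is batching: a few thousand tiny draw calls cost far more than one big one. The question's code is Java, but the idea is language-independent; in C-style GLES 2.0 it looks roughly like the sketch below, with all square positions and colours packed once into one interleaved VBO and all indices into one IBO (the buffer/handle names are illustrative).

      // Per frame: bind the prebuilt buffers and draw the whole grid at once.
      glBindBuffer(GL_ARRAY_BUFFER, gridVbo);
      glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, gridIbo);
      glEnableVertexAttribArray(posLoc);
      glEnableVertexAttribArray(colorLoc);
      glVertexAttribPointer(posLoc, 2, GL_FLOAT, GL_FALSE, stride, (void *)0);
      glVertexAttribPointer(colorLoc, 4, GL_UNSIGNED_BYTE, GL_TRUE, stride,
                            (void *)(2 * sizeof(float)));   // colour rides along as a vertex attribute
      glDrawElements(GL_TRIANGLES, squareCount * 6, GL_UNSIGNED_SHORT, 0);

    One glDrawElements per frame for the entire grid replaces one per square, and the per-square colour moves from a uniform into the vertex data.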

    Read the article

  • Rendering 8-bit graphics

    - by Matjaz Muhic
    I have a strong programming background, just not in game development. I only made some Pong and Snake in high school, and I did some OpenGL in college. I want to make my own game engine. Nothing fancy, just a simple 2D game engine. But because I'm kinda old school and feeling retro, I want the graphics to look like old 8-bit games (Mega Man, Contra, Super Mario, ...). So how were the old games made back then? I want the simplest approach. Were they also using assets (images) like newer engines do now? How do you achieve this kind of rendering using OpenGL? Keep in mind: simplest solution. I want to know how it was made back then and how I can replicate that. It doesn't even have to be OpenGL; I can draw on a window canvas. I do want to make it from scratch, basically.
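
    One common way to fake the old look on modern hardware (a sketch, not the only approach, and not how the original consoles worked internally): draw each frame into a small pixel buffer at a retro resolution, upload it to a texture, and stretch that to the window with nearest-neighbour filtering so the pixels stay chunky.

      // One-time setup: a low-res texture (256x240 is roughly NES-sized).
      GLuint tex;
      glGenTextures(1, &tex);
      glBindTexture(GL_TEXTURE_2D, tex);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); // no smoothing
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
      glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 240, 0,
                   GL_RGB, GL_UNSIGNED_BYTE, nullptr);

      // Each frame: copy in the freshly drawn pixels (a 256x240x3 byte buffer
      // you write sprites into on the CPU), then draw one screen-filling quad
      // textured with tex.
      glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 256, 240,
                      GL_RGB, GL_UNSIGNED_BYTE, pixels);

    The original consoles instead generated the picture scanline by scanline from tile and sprite tables, but the low-res-buffer-plus-scale trick reproduces the look with very little code.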

    Read the article

  • Why do I get a blinking screen when running LWJGL?

    - by SystemNetworks
    I didn't have any errors, but when I run my LWJGL game, it gives me a blinking screen. Here is the code:

      package L1F3;

      import org.lwjgl.opengl.Display;
      import org.lwjgl.opengl.DisplayMode;
      import org.lwjgl.LWJGLException;
      import static org.lwjgl.opengl.GL11.*;

      public class Main {
          public static void main(String[] args) {
              try {
                  Display.setDisplayMode(new DisplayMode(640, 480));
                  Display.setTitle("A fresh display!");
                  Display.create();
              } catch (LWJGLException e) {
                  e.printStackTrace();
                  Display.destroy();
                  System.exit(1);
              }
              while (!Display.isCloseRequested()) {
                  Display.update();
              }
              Display.destroy();
              System.exit(0);
          }
      }

    How do I stop the blinking screen? I was thinking it's my frame rate. I deleted Display.sync but it still gives me all white and black. Last time it didn't give me a blinking screen.

    EDIT: When I remove Display.update(), it gives me a perfect screen, no blinking or white. Will my game work without it? I can also close it perfectly.

    Read the article

  • Game Development

    - by Sundareswaran Senthilvel
    I'm planning to write a game from scratch (a BIG game, for commercial purposes). I'm aware that certain compute libraries are available on the market, like OpenCL, the AMD APP SDK, C++ AMP, and DirectCompute (the last two both from MS; I'm NOT interested in CUDA). I'm planning to write the game from scratch, which includes the following engines: 1. a physics engine, 2. an AI engine, 3. the main game engine (and anything else I may have missed). I'm aware that there are some free physics engine libraries on the market; I'm not sure about free AI engine libraries. I'm a bit confused about choosing between the OpenCL, AMD APP SDK, and C++ AMP libraries (as already mentioned, I'm NOT interested in CUDA). I want my game to be published on Windows/Android/Mac OS X, which means it should be a cross-platform game: I will have one source code base that I'll compile for the various platforms, Windows/Android/Mac OS X and any others I've missed. Note: since I'm NOT a Java guy, kindly do NOT suggest the Java language. For the graphics API, should I use OpenGL or DirectX 11? I have heard that OpenGL runs on a single core, and I'm not sure about DirectX 11. Between OpenGL and DirectX, which one should I follow? Or are there other graphics APIs I should start with? I want to make use of the parallelism in the GPU as well as the CPU.

    Read the article

  • Trouble with onscreen keyboard orientation in an iPhone OpenGL ES application.

    - by Plumenator
    I need to take keyboard input in my OpenGL ES application, so I just created an empty UITextView and added it as a subview to the main window along with the view that presents my content. I use the UITextView to control the keyboard, and it works fine in a single orientation. I then changed my code to support all orientations by rotating the OpenGL content myself based on UIDeviceOrientation notifications. To rotate the keyboard, I overrode the shouldAutoRotateInterfaceOrientation method in the UITextView's controller, but I still see that the keyboard does not rotate according to the orientation. Any clues?

    Read the article

  • OpenGL ES view: how to orient it to landscape?

    - by Steph Thirion
    Looking for clues about orienting an OpenGL ES app in landscape, most information I found dates back to 2008, most of it referring to the early versions of the SDK. Apparently, back in the day, in the case of GL it was recommended not to rotate the view, but instead to apply the rotation as a GL transformation. Is that still the case with the current SDKs? It would be so much simpler to simply rotate the window: all the touch events would be in sync with the rotation. In other words: how do you set up an OpenGL view in landscape mode?
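
    For reference, the older GL-transform approach mentioned above boils down to rotating the projection a quarter turn and drawing in landscape coordinates. A GLES 1.x sketch for one of the two landscape orientations; the 320x480 numbers are just the classic iPhone screen, and the touch remapping is approximate:

      glMatrixMode(GL_PROJECTION);
      glLoadIdentity();
      glOrthof(0.0f, 320.0f, 0.0f, 480.0f, -1.0f, 1.0f); // matches the portrait framebuffer
      glTranslatef(320.0f, 0.0f, 0.0f);                  // shift...
      glRotatef(90.0f, 0.0f, 0.0f, 1.0f);                // ...and rotate into landscape
      glMatrixMode(GL_MODELVIEW);
      glLoadIdentity();
      // Draw using landscape coordinates: x in 0..480, y in 0..320.
      // Touches still arrive in portrait coordinates and need the inverse
      // mapping, roughly (x, y) -> (y, 320 - x).

    Rotating the view/window instead (the simpler route the question hopes for) avoids that touch bookkeeping, which is exactly the trade-off being asked about.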

    Read the article

  • iPhone - Drawing 2D with OpenGL ES, fast and simple.

    - by Johannes Jensen
    I'm going to make a game for the iPhone, and I'm mostly going to be using images. I've read that using Quartz alone is slow for actual games with high frame rates, so I was wondering if you guys had any good ideas for using OpenGL to render a game scene. I'm going to be using a lot of images, and I want to be able to freely rotate them. I've looked at Apple's examples GLSprite and GLPaint, but I don't really see anything I could use. All I want is to be able to render images at specific positions and to be able to rotate them. I'm a noob at OpenGL, but I know Quartz.
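
    The usual pattern for "images I can rotate" in GLES 1.x is a textured quad per sprite, with rotation done by the modelview matrix. A sketch; the texture id, quad arrays, and enabled client states (GL_VERTEX_ARRAY, GL_TEXTURE_COORD_ARRAY) are assumed to be set up elsewhere:

      glPushMatrix();
      glTranslatef(spriteX, spriteY, 0.0f);              // position the sprite
      glRotatef(angleDegrees, 0.0f, 0.0f, 1.0f);         // spin it about its centre
      glBindTexture(GL_TEXTURE_2D, spriteTexture);
      glVertexPointer(2, GL_FLOAT, 0, quadVertices);     // 4 corner positions around the origin
      glTexCoordPointer(2, GL_FLOAT, 0, quadTexCoords);  // matching UVs
      glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
      glPopMatrix();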

    Read the article

  • How should I structure my iPhone OpenGL ES 1.1 game?

    - by Ryan
    I am building an iPhone OpenGL ES 1.1 game. I am using the OpenGL ES template provided by Xcode, and I am only using the ES1Renderer. I've coded some basic touch actions the user can take, and I've begun to think about the overall structure of the code. All I'm really doing is using C in ES1Renderer.m for my entire game state: I have an array of bullet structs, an array of enemy structs, etc. Besides this structure, where my entire game state lives in ES1Renderer.m as C arrays and structs, what other ways are there to structure the code? Should I be using C++ or Objective-C classes to represent the enemies so they are more modular? The main reason I ask is that I don't normally code in C, C++, or Objective-C, so I am a little fuzzy on coming up with a good architecture here.
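
    One lightweight refactor that keeps the plain-C structs but gets them out of the renderer (a sketch; all names are illustrative): put the state and its update rule in their own files and let ES1Renderer only read from them.

      /* game_state.h - plain C, no OpenGL or UIKit includes */
      typedef struct { float x, y, vx, vy; int alive; } Bullet;
      typedef struct { float x, y; int hp; } Enemy;

      typedef struct {
          Bullet bullets[128];
          Enemy  enemies[32];
          float  playerX, playerY;
      } GameState;

      void game_update(GameState *state, float dt);   /* simulation only, no drawing */

    ES1Renderer.m then calls game_update() once per frame and walks the arrays to issue draw calls; rendering never mutates the state, which keeps the GL code and the game rules independently testable.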

    Read the article

  • What is the best approach to using OpenGL on the web?

    - by Y_Y
    I wrote a program in C++/OpenGL (using the Dev-C++ compiler) for my Calculus 2 class. The teacher liked the program, and he asked me to somehow put it online so that, instead of downloading the .exe file and running it, the web browser will run it automatically, just like a Java applet. The question is: how, if possible, can I display a C++/OpenGL program in a web browser? I am thinking of moving to JOGL, which is a Java binding for OpenGL, but I would rather stay in C++ since I am more familiar with it. Also, is there any other better and easier 3D web-based API that I can consider?

    Read the article

  • HD Video on Pandaboard ES

    - by Lord Loh.
    I am running Ubuntu 11.10 on my PandaBoard ES. I attempted to play back a 720p / 1080p video. In both cases, the video was highly jittery and the audio seemed to be plagued with something that sounds like white noise (but definitely is not). I was using VLC media player for playback and running Xfce 4 as the desktop environment. The system monitor graph shows both CPU cores running at ~100%. Lower-resolution videos seemed to play without problems. How do I play 1080p / 720p HD videos on the PandaBoard? Thanks in advance.

    Read the article
