Search Results

Search found 1370 results on 55 pages for 'ed gl'.

Page 8/55 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • How are OpenGL ES 1 framebuffers and textures sized?

    - by jens
    I am trying to draw to a texture using a framebuffer with OpenGL ES 1.1 on Android (Java). Afterwards I want to overlay this texture full-screen over my game. In theory this works like a charm, but somehow the coordinates are off. For testing I drew something at (0, 0) with width and height 200, and it is partly off-screen. This is how I create the framebuffer:

        fb = new int[1];
        depthRb = new int[1];
        renderTex = new int[1];
        gl11ep.glGenFramebuffersOES(1, fb, 0);
        gl11ep.glGenRenderbuffersOES(1, depthRb, 0); // the depth buffer
        gl.glGenTextures(1, renderTex, 0); // generate texture
        gl.glBindTexture(GL10.GL_TEXTURE_2D, renderTex[0]);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_REPEAT);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_REPEAT);
        texBuffer = ByteBuffer.allocateDirect(buf.length * 4).order(ByteOrder.nativeOrder()).asIntBuffer();
        gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_LUMINANCE, texW, texH, 0, GL10.GL_LUMINANCE, GL10.GL_UNSIGNED_BYTE, texBuffer);
        gl11ep.glBindRenderbufferOES(GL11ExtensionPack.GL_RENDERBUFFER_OES, depthRb[0]);
        gl11ep.glRenderbufferStorageOES(GL11ExtensionPack.GL_RENDERBUFFER_OES, GL11ExtensionPack.GL_DEPTH_COMPONENT16, texW, texH);

    Before I draw, I do this:

        gl11ep.glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, fb[0]);
        gl.glClearColor(0f, 0f, 0f, 0f);
        // specify texture as color attachment
        gl11ep.glFramebufferTexture2DOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, GL11ExtensionPack.GL_COLOR_ATTACHMENT0_OES, GL10.GL_TEXTURE_2D, renderTex[0], 0);
        // attach render buffer as depth buffer
        gl11ep.glFramebufferRenderbufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, GL11ExtensionPack.GL_DEPTH_ATTACHMENT_OES, GL11ExtensionPack.GL_RENDERBUFFER_OES, depthRb[0]);

    I set texW = 1024 and texH = 512. When I render this texture full-screen with a 200x200 light mask placed at (0, 0) and at (texW/2, texH/2), it seems like the coordinate system doesn't start at (0, 0): the light at the origin overlaps the screen edge, and the images are not drawn as squares (my light-cone texture is a circle, not an ellipse). So, how is the coordinate system of this offscreen-drawn texture defined? Thanks
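
    A hedged guess at the cause, with a minimal sketch (texW, texH, fb, gl and gl11ep follow the question; screenW and screenH are hypothetical names): when the framebuffer is bound, the viewport and projection still describe the screen rather than the 1024x512 texture, so drawing into it is scaled and offset. Pointing both at the texture's dimensions before drawing gives a pixel coordinate system that starts at (0, 0):

        // Sketch: match viewport and projection to the FBO texture size.
        gl11ep.glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, fb[0]);
        gl.glViewport(0, 0, texW, texH); // cover the whole texture
        gl.glMatrixMode(GL10.GL_PROJECTION);
        gl.glLoadIdentity();
        gl.glOrthof(0, texW, texH, 0, -1, 1); // pixel coordinates, origin top-left
        gl.glMatrixMode(GL10.GL_MODELVIEW);
        gl.glLoadIdentity();
        // ... draw the 200x200 light masks here ...
        // then restore the on-screen viewport/projection before compositing:
        gl11ep.glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, 0);
        gl.glViewport(0, 0, screenW, screenH);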

    Read the article

  • Multi-threaded JOGL Problem

    - by moeabdol
    I'm writing a simple OpenGL application in Java that implements the Monte Carlo method for estimating the value of PI. The method is pretty easy: you draw a circle inside a unit square and then plot random points over the scene. For each point that lands inside the circle you increment the counter of "in" points. After determining for all the random points whether they are inside the circle or not, you divide the number of in points by the total number of points plotted and multiply by 4 to get an estimation of PI: PI = (inPoints / totalPoints) * 4. This works because mathematically the ratio of a circle's area to its bounding square's area is PI/4, so multiplying that ratio by 4 gives PI. My problem doesn't lie in the algorithm itself; I'm having trouble plotting the points as they are being generated instead of plotting everything at once when the program finishes executing. I want to give the application a sense of real-time display where the user sees the points as they are being plotted. I'm a beginner at OpenGL and I'm pretty sure there is a multi-threading feature built into it. Nonetheless, I tried to manually create my own threads, where each worker thread plots one point at a time. Following is the pseudo-code:

        /* this part of the code exists in the display() method in MyCanvas.java,
           which extends GLCanvas and implements GLEventListener */
        // main loop
        for (int i = 0; i < number_of_points; i++) {
            RandomGenerator random = new RandomGenerator();
            float x = random.nextFloat();
            float y = random.nextFloat();
            Thread pointThread = new Thread(new PointThread(x, y));
        }
        gl.glFlush();

        /* this part of the code exists in the run() method in PointThread.java,
           which implements Runnable */
        void run() {
            try {
                gl.glPushMatrix();
                gl.glBegin(GL2.GL_POINTS);
                if (pointIsIn)
                    gl.glColor3f(1.0f, 0.0f, 0.0f); // red point
                else
                    gl.glColor3f(0.0f, 0.0f, 1.0f); // blue point
                gl.glVertex3f(x, y, 0.0f); // coordinates
                gl.glEnd();
                gl.glPopMatrix();
            } catch (Exception e) {
            }
        }

    I'm not sure if my approach to solving this issue is correct. I hope you guys can help me out. Thanks.
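
    A hedged sketch of the usual alternative (class and method names here are hypothetical): a GL context can be current on only one thread at a time, so worker threads normally issue no gl.* calls at all. Instead, a generator thread appends points to a thread-safe collection, and display() draws whatever has accumulated each frame while an animator keeps triggering repaints, so the points appear incrementally:

        import java.util.Random;
        import java.util.concurrent.ConcurrentLinkedQueue;

        // Sketch: point generation off the GL thread, drawing on it.
        class MonteCarloPoints {
            static final class Point {
                final float x, y;
                final boolean in;
                Point(float x, float y, boolean in) { this.x = x; this.y = y; this.in = in; }
            }

            // the generator adds, the GL thread iterates; both are safe here
            static final ConcurrentLinkedQueue<Point> POINTS = new ConcurrentLinkedQueue<Point>();

            static void startGenerator(final int count) {
                new Thread(new Runnable() {
                    public void run() {
                        Random random = new Random();
                        for (int i = 0; i < count; i++) {
                            float x = random.nextFloat(), y = random.nextFloat();
                            // inside the circle of radius 0.5 centred on (0.5, 0.5)?
                            boolean in = (x - 0.5f) * (x - 0.5f) + (y - 0.5f) * (y - 0.5f) <= 0.25f;
                            POINTS.add(new Point(x, y, in));
                        }
                    }
                }).start();
            }
        }

        // In display(GLAutoDrawable drawable), on the GL thread (JOGL types assumed):
        //     GL2 gl = drawable.getGL().getGL2();
        //     gl.glBegin(GL2.GL_POINTS);
        //     for (MonteCarloPoints.Point p : MonteCarloPoints.POINTS) {
        //         if (p.in) gl.glColor3f(1f, 0f, 0f); else gl.glColor3f(0f, 0f, 1f);
        //         gl.glVertex3f(p.x, p.y, 0f);
        //     }
        //     gl.glEnd();
        // An FPSAnimator (or repeated repaint() calls) keeps this running each frame.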

    Read the article

  • How to stop an activity with GL view. Application not responding

    - by David
    I have an activity that I want to programmatically stop after some elapsed time, and it has a GL view running. My understanding is that the GLSurfaceView.Renderer runs in its own thread, so I created a handler instantiated from the activity, and in the GLSurfaceView I post a message to it. The handler then attempts to stop the activity with the following:

        onStop();
        finish();

    The activity seems to close. However, if I try to restart the application, I get a blank screen, the phone is practically locked up, and eventually I get a message that the activity in the application is not responding. I'm also getting a wake lock, i.e. PowerManager.newWakeLock(...), and when I try to release it, I get an exception. I'm not sure if it's the GL view or the wake lock that is causing me trouble. Any ideas appreciated.
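
    A hedged sketch of the usual pattern (glView and wakeLock are assumed fields of the activity; imports android.os.Handler and android.os.Looper): lifecycle callbacks such as onStop() are invoked by the framework and shouldn't be called directly. Posting a plain finish() to the UI thread, and letting onPause() suspend the GL thread and release the wake lock, tends to leave things in a restartable state:

        private final Handler uiHandler = new Handler(Looper.getMainLooper());

        // called from the GL renderer thread when the elapsed time is up
        void requestStop() {
            // don't call onStop() yourself; finish() makes the framework
            // run onPause()/onStop()/onDestroy() in order
            uiHandler.post(new Runnable() {
                public void run() { finish(); }
            });
        }

        @Override
        protected void onPause() {
            super.onPause();
            glView.onPause(); // stops the GL thread cleanly
            if (wakeLock.isHeld()) wakeLock.release(); // guards the release() exception
        }

        @Override
        protected void onResume() {
            super.onResume();
            wakeLock.acquire();
            glView.onResume();
        }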

    Read the article

  • How do I see a history of what I've POST-ed in Google Chrome?

    - by Tomas Lycken
    I just submitted a form that included a text box in which I had written quite a long text. In another text box, I filled in a date in the wrong format, and instead of giving an error message, the web site just acted as if my form submission was valid, except nothing was saved. Is there any way to see the history of what has been POST-ed (in the current session, at least), from which I can recover my lost text?

    Read the article

  • Don’t miss a thing when you’re going to the MIX, DevConnections, TechEd or PDC conferences

    - by albertpascual
    Besides all the sessions and courses found in the agenda, there are events happening around the conference that you would otherwise miss. Those events are published and indexed in this iPhone & iPad app, so you can find the parties and external events near the conference. Download it for free here if you are going to MIX, DevConnections, TechEd or PDC this year: http://itunes.apple.com/us/app/eventmeetup/id421597442?mt=8&ls=1 Cheers, Al

    Read the article

  • Difference between GL10 and GLES10 on Android

    - by kayahr
    The GLSurfaceView.Renderer interface of the Android SDK gives me a GL interface as a parameter, which has the type GL10. This interface is implemented by some private internal JNI wrapper class. But there is also the class GLES10, where all the GL methods are available as static methods. Is there an important difference between them? What happens if I ignore the gl parameter of onDrawFrame and instead use the static methods of GLES10 everywhere? Here is an example. Instead of doing this:

        void onDrawFrame(GL10 gl) {
            drawSomething(gl);
        }

        void drawSomething(GL10 gl) {
            gl.glLoadIdentity();
            ...
        }

    I could do this:

        void onDrawFrame(GL10 gl) {
            drawSomething();
        }

        void drawSomething() {
            GLES10.glLoadIdentity();
            ...
        }

    The advantage is that I don't have to pass the GL context to all called methods. But even if it works (and it works, I tried it), I wonder if there are any disadvantages and reasons NOT to do it like that.

    Read the article

  • Android OpenGL ES "glDrawTexfOES" draws upside down

    - by Alle
    I'm using OpenGL for Android to draw my 2D images. Whenever I draw something using this code:

        gl.glViewport(aspectRatioOffset, 0, screenWidth, screenHeight);
        gl.glMatrixMode(GL10.GL_PROJECTION);
        gl.glLoadIdentity();
        GLU.gluOrtho2D(gl, aspectRatioOffset, screenWidth + aspectRatioOffset, screenHeight, 0);
        gl.glMatrixMode(GL10.GL_MODELVIEW);
        gl.glLoadIdentity();
        gl.glEnable(GL10.GL_TEXTURE_2D);
        gl.glBindTexture(GL10.GL_TEXTURE_2D, myScene.neededGraphics.get(ID).get(animationID).get(animationIndex));
        crop[0] = 0;
        crop[1] = 0;
        crop[2] = width;
        crop[3] = height;
        ((GL11Ext) gl).glDrawTexfOES(x, y, z, width, height);

    I get an upside-down result. I've seen people solve this by doing:

        crop[0] = 0;
        crop[1] = height;
        crop[2] = width;
        crop[3] = -height;

    However, this hurts the logic in my application, so I would like the result not to be flipped upside down. Does anyone know why this happens, and a way of avoiding or solving it?
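
    A hedged sketch of one workaround (variable names follow the question): glDrawTexfOES ignores the projection and modelview matrices and takes window coordinates with the origin at the bottom-left, while the gluOrtho2D call above puts (0, 0) at the top-left, hence the flip. Converting the y coordinate once, at the draw call, keeps the rest of the top-left logic untouched:

        // Sketch: convert a top-left y to glDrawTexfOES's bottom-left origin.
        crop[0] = 0;
        crop[1] = 0;
        crop[2] = width;
        crop[3] = height;
        ((GL11) gl).glTexParameteriv(GL10.GL_TEXTURE_2D,
                GL11Ext.GL_TEXTURE_CROP_RECT_OES, crop, 0);
        float glY = screenHeight - y - height; // flip once, at the boundary
        ((GL11Ext) gl).glDrawTexfOES(x, glY, z, width, height);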

    Read the article

  • WebGL transparent black.

    - by Catalin Dumitru
    I have a strange problem that I can't figure out when trying to do blending in WebGL. Black is rendered fully transparent, and everything with shades of grey in it is also rendered semi-transparent. I have set it to use the alpha channel as the source for transparency, and in some respects it works: everything that isn't black/grey is rendered differently when I change the alpha value. But even when I set the alpha to 1, black is still displayed transparent. This is how I enable transparency:

        this.gl.blendFunc(this.gl.SRC_ALPHA, this.gl.ONE);
        this.gl.enable(this.gl.BLEND);
        this.gl.disable(this.gl.DEPTH_TEST);

    And the part of the shader that does transparency:

        gl_FragColor = vec4(texColor.rgb * vLightWeight, texColor.a * uAlpha);

    where texColor is the sampled texture color, vLightWeight is the shadowing calculated in the vertex shader, and uAlpha is the uniform I use for transparency.
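
    A hedged reading of the symptom: blendFunc(SRC_ALPHA, ONE) is additive blending, so the final color is src.rgb * src.a + dst.rgb, and a black source contributes nothing no matter what the alpha is, which looks exactly like full transparency. Standard "over" compositing weights the destination by one minus the source alpha instead; in WebGL that would be this.gl.blendFunc(this.gl.SRC_ALPHA, this.gl.ONE_MINUS_SRC_ALPHA). The same one-line change, written in the Android GL10 Java binding used elsewhere on this page:

        // Sketch: standard "over" blending, so black with alpha = 1 stays opaque.
        gl.glEnable(GL10.GL_BLEND);
        gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);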

    Read the article

  • Is it safe to make GL calls with multiple threads?

    - by user146780
    I was wondering if it is safe to make GL calls from multiple threads. Basically, I'm using a GLUtesselator and was wondering if I could divide the objects to draw into four groups and assign a thread to each one. I'm just wondering if this would cause trouble, since the tesselator uses callback functions. Can two threads run the same callback at the same time, as long as that callback does not access any global variables? Are there also other ways I could optimize OpenGL drawing using multithreading? Thanks
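
    A hedged sketch of one common split (all names are illustrative, not an existing API): since a GL context can be current on only one thread at a time, the GL calls themselves don't parallelize safely, but the CPU-side tessellation can, as long as each worker owns its own tessellator object and output buffer, with the GL thread drawing the results afterwards:

        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.Callable;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.Future;

        // Sketch: tessellate in parallel, render single-threaded.
        class TessellationJob implements Callable<float[]> {
            private final Object shape; // hypothetical polygon description
            TessellationJob(Object shape) { this.shape = shape; }
            public float[] call() {
                // each worker would create its OWN tessellator here and emit
                // plain vertex data; no GL calls and no shared callbacks
                return new float[0]; // placeholder for the emitted triangles
            }
        }

        class ParallelTessellation {
            static List<float[]> tessellateAll(List<Object> shapes) throws Exception {
                ExecutorService pool = Executors.newFixedThreadPool(4);
                List<Future<float[]>> futures = new ArrayList<Future<float[]>>();
                for (Object s : shapes) futures.add(pool.submit(new TessellationJob(s)));
                List<float[]> meshes = new ArrayList<float[]>();
                for (Future<float[]> f : futures) meshes.add(f.get());
                pool.shutdown();
                return meshes; // the GL thread uploads/draws these afterwards
            }
        }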

    Read the article

  • Unity stuck in 2D mode, Nvidia Quadro graphics "unknown", Nvidia-Current active but not in use

    - by Jordan Lund
    I've seen this problem reported under several questions, but I haven't been able to resolve any of it, so I thought I'd bring it all in under one umbrella. I started a new job and was given a Dell Precision M6400 laptop with an Nvidia Quadro FX 2700M graphics card. It had a previous version of Ubuntu on it, but nobody had any passwords for it, so I wiped the drive and did a fresh install of 11.10 from scratch. I didn't do any updates during installation, preferring to do them after boot. Updates ran fine and the system works... except Unity is in 2D mode. System Settings - Additional Drivers shows that nvidia-current is active but not in use. System Settings - System Info shows Graphics Driver: Unknown, Experience: Standard. Nvidia X Server Settings is installed and working; rewriting the xorg.conf did nothing.

        $ /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: Quadro FX 2700M/PCI/SSE2
        OpenGL version string: 3.3.0 NVIDIA 285.05.09
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: yes

    One suggestion was to do sudo apt-get --purge remove nvidia*, but that resulted in a scrambled screen on boot and a non-bootable installation. Pressing the Delete key on boot allowed me to access the recovery console and do sudo apt-get install nvidia-current, which brought me back to a working, bootable system. Another suggestion was to edit /etc/default/grub and change the line reading GRUB_CMDLINE_LINUX_DEFAULT="quiet splash" to read GRUB_CMDLINE_LINUX_DEFAULT="quiet splash vmalloc=192MB", thus allocating more video RAM. I did that, followed by sudo update-grub and a reboot. No change. I also created a brand new standard user and logged on with that account; no change.

    Read the article

  • Screen becomes black after pressing dash or alt-tab

    - by cegerxwin
    I did an upgrade from 11.04 to 11.10. Unity 3D shows a black screen after pressing the dash button or after pressing Alt-Tab to switch between open windows. I can see the panel on the top (lock, sound, ...) and the panel on the left (launcher), but the rest is black; it looks like a maximised black window. The open windows are active but I can't see them. I log out by pressing logout in the top right corner and pressing Enter (because logout is focused by default on the dialogue screen) and leave Unity 3D. Unity 3D worked very well with 11.04. If I press the dash button, the dash looks like a 16-bit or 8-bit window, and the buttons for maximise, minimise and close are displayed and look inverted. I have rebooted my notebook just now, logged in to Unity 3D and tested some features of Unity, and everything works well. The black thing is only a layer: I can use my desktop but can't see anything because of the layer, even though everything works. It seems that a layer appears when pressing dash or Alt-Tab and does not disappear when closing the dash or choosing a running app with Alt-Tab. Here is the necessary info related to the video problems. Unity support:

        $ /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: X.Org R300 Project
        OpenGL renderer string: Gallium 0.4 on ATI RC410
        OpenGL version string: 2.1 Mesa 7.11-devel
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: yes

    xorg glxinfo:

        $ lspci -nn | grep VGA
        01:05.0 VGA compatible controller [0300]: ATI Technologies Inc RC410 [Radeon Xpress 200M] [1002:5a62]

    Read the article

  • How to enable Unity 3D support in 12.04 using open-source drivers for RadeonHD cards?

    - by martin
    As the title says, I can't enable Unity 3D support when I'm using the open-source drivers (xorg-edgers). I have an XFX Radeon HD 6950, by the way. If I install the proprietary 12.3 drivers from AMD it works, but I get poorer 2D performance than with the open-source drivers, and I also get some freezes and lock-ups at random. Because of this I'm trying the open-source drivers, and so far there are no issues at all, except this one. Running this command:

        $ /usr/lib/nux/unity_support_test -p

    shows this:

        OpenGL vendor string: VMware, Inc.
        OpenGL renderer string: Gallium 0.4 on llvmpipe (LLVM 0x300)
        OpenGL version string: 2.1 Mesa 8.0.2
        Not software rendered: no
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: no

    And this command:

        $ lspci -nn | grep VGA

    shows:

        01:00.0 VGA compatible controller [0300]: Advanced Micro Devices [AMD] nee ATI Cayman PRO [Radeon HD 6950] [1002:6719]

    So, is this normal? Do I need to go back to the proprietary drivers to enable Unity 3D? If anyone can help me, I'll much appreciate it.

    Read the article

  • Ubuntu, Bumblebee and Intel problem

    - by LnxSlck
    I have a brain teaser that I can't solve. When 12.04 (64-bit) came out, I did a fresh install and then installed Bumblebee with:

        sudo add-apt-repository ppa:bumblebee/stable
        sudo apt-get install bumblebee bumblebee-nvidia

    Ubuntu recognized (System Settings - Details) the Intel card as the primary card, and I could launch applications with optirun. I have:

        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation GF119 [GeForce GT 520MX] (rev ff)

    Now, a few days back, I did the exact same thing: installed from the same DVD, everything the same, installed Bumblebee the same way. Nvidia with optirun works fine, but Ubuntu doesn't have 3D effects:

        root@deathstar:~# optirun /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: GeForce GT 520MX/PCIe/SSE2
        OpenGL version string: 4.2.0 NVIDIA 295.40
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: no
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: no

    I never did anything to install the Intel drivers (except installing mesa-utils); before, Bumblebee got everything working, but now I can't get Ubuntu to run Unity with 3D. Can someone please help me get Unity 3D working? Your help will be much appreciated.

    Read the article

  • nvidia segfault crashes XServer (12.04 fresh install)

    - by Sébastien GILLES
    I am on a fresh install of 12.04, and my video card is a Quadro FX 570, which appears to be supported (see the output of unity_support_test below). But my X server crashes randomly, which shuts down my session and brings me back to the login screen. This happens several times per hour. After checking syslog, it appears that a segfault in the nvidia driver is the cause of the problem:

        Jun 13 11:14:17 lima kernel: [86569.828982] Xorg[17940]: segfault at ffebeb69 ip b4c8f7aa sp bf8076dc error 4 in nvidia_drv.so[b47a7000+706000]
        Jun 13 11:14:36 lima gnome-session[18119]: Gdk-WARNING: gnome-session: Fatal IO error 11 (Ressource temporairement non disponible) on X server :3.#012

    The funny thing is that as I was reporting this bug, my X server crashed again (!), and this time I got another error message:

        Jun 13 11:29:39 lima kernel: [87491.106441] NVRM: Xid (0000:02:00): 6, PE0001
        Jun 13 11:29:39 lima gnome-session[26493]: Gdk-WARNING: gnome-session: Fatal IO error 11 (Ressource temporairement non disponible) on X server :4.#012

    Any idea as to what the problem can be, given that I am on a fresh install with a supported video card? Sébastien

    Output of unity_support_test:

        $ /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: Quadro FX 570/PCIe/SSE2
        OpenGL version string: 3.3.0 NVIDIA 295.40
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: yes

    Read the article

  • Laggy Graphics After Upgrade to Ubuntu 13.10

    - by John bracciano
    I noticed some issues concerning the graphics after I upgraded from Ubuntu 13.04 to 13.10. More specifically: the Dash tends to be more laggy than it used to be; when I switch desktops you can clearly see the screen going frame by frame; and the menu bar is also laggy. When I open GIMP, the menu bar becomes useless after a while, taking seconds to refresh. The problems seem to appear after some seconds of use: when I boot the system everything works fine, until some more graphics-intensive program starts (Firefox, GIMP, etc.).

        $ /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: Intel Open Source Technology Center
        OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile
        OpenGL version string: 3.0 Mesa 9.2.1
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: yes

    I use an Asus Zenbook UX31A with an Intel® Core i5 (Ivy Bridge), integrated Intel® HD Graphics 4000 and the Intel® HM76 Express Chipset. Thank you in advance for any answer!

    Read the article

  • Radeon 9200 Driver?

    - by usuhh86
    I've been using Linux for a while, but haven't really done more than install programs. I have Ubuntu 12.04 with an ATI Radeon 9200 graphics card, which Ubuntu didn't recognize during the install. All I want is a driver: preferably the proprietary driver, but I'll settle for an open-source one if I need to. I went to the AMD support website, and whenever I click on the download link for "ATI Proprietary Linux x86 Display Driver 8.28.8" I get redirected to the main AMD website. I tried this in Firefox and Chrome, and even booted up my netbook and tried it in IE9. Does anybody have a download link? And if not, a link to download an open-source alternative? Any help will be greatly appreciated; I'm pretty much still a newbie at this.

        OpenGL vendor string: Tungsten Graphics, Inc.
        OpenGL renderer string: Mesa DRI R200 (RV280 5961) x86/MMX/SSE2 TCL DRI2
        OpenGL version string: 1.3 Mesa 8.0.2
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: no
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: no
        Unity 3D supported: no

    Read the article

  • Unity 3D doesn't work on Ubuntu 12.04 LTS in VirtualBox hosted by Windows 7 64-bit

    - by Mario
    I haven't been able to google up a solution for some time, so I'm asking here. I have Linux Ubuntu 12.04 LTS installed in VirtualBox, hosted by Windows 7 64-bit. My 3D simply stopped working after some kernel headers update a few months ago, just like that; I don't remember which version it was. In the meantime there have been two or three new releases of VirtualBox, which I have installed, and every time I update VirtualBox I update the Guest Additions to the newest version. My 3D in Ubuntu still doesn't work.

        root@pjadmin-VirtualBox:~# /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: VMware, Inc.
        OpenGL renderer string: Gallium 0.4 on llvmpipe (LLVM 0x300)
        OpenGL version string: 2.1 Mesa 8.0.2
        Not software rendered: no
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: no

    The graphics card in my PC is an ATI 6850. Please help.

    Read the article

  • Getting Unity 3D working on legacy Nvidia card

    - by user69545
    I installed the latest nVIDIA drivers for my FX5500 card. I understand that the X server version does not officially support this driver or card, but I was wondering what I can do to get Compiz running. I have researched this issue for hours but cannot come up with an answer myself. I might be doing all this for nothing, but I wanted to at least try. Here is the output of my test:

        mike@mike-linux-box:~$ /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: GeForce FX 5500/AGP/SSE2
        OpenGL version string: 2.1.2 NVIDIA 173.14.35
        Not software rendered: yes
        Not blacklisted: no
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: no

    So I was wondering: what is the "Not blacklisted" test? Is this the Nouveau blacklisting? The nVIDIA driver did that automatically. Does this need to be removed? Any help would be appreciated; I just want to run Compiz effects. Thanks.

    Read the article

  • Low Graphics and laggy screen after update to 13.10 from 13.04

    - by Wh0RU
    After updating from 13.04 to 13.10, much of the Unity 3D functionality has stopped working. I am using a Samsung Series 3 (NP350V5X) laptop, which has switchable Intel and AMD Radeon HD 7670M graphics. xserver-xorg-video-ati does work, but with NO 3D support, and graphics performance is very low (I am currently using this). fglrx and fglrx-updates show a blank screen after login. The Intel graphics installer doesn't work either (dependency error). Output of sudo lshw -c video:

        *-display
             description: VGA compatible controller
             product: Thames [Radeon HD 7500M/7600M Series]
             vendor: Advanced Micro Devices, Inc. [AMD/ATI]
             physical id: 0
             bus info: pci@0000:01:00.0
             version: 00
             width: 64 bits
             clock: 33MHz
             capabilities: pm pciexpress msi vga_controller bus_master cap_list rom
             configuration: driver=radeon latency=0
             resources: irq:45 memory:e0000000-efffffff memory:c0120000-c013ffff ioport:3000(size=256) memory:c0100000-c011ffff
        *-display
             description: VGA compatible controller
             product: 3rd Gen Core processor Graphics Controller
             vendor: Intel Corporation
             physical id: 2
             bus info: pci@0000:00:02.0
             version: 09
             width: 64 bits
             clock: 33MHz
             capabilities: msi pm vga_controller bus_master cap_list rom
             configuration: driver=i915 latency=0
             resources: irq:46 memory:bfc00000-bfffffff memory:d0000000-dfffffff ioport:4000(size=64)

    Similarly:

        $ /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: VMware, Inc.
        OpenGL renderer string: Gallium 0.4 on llvmpipe (LLVM 3.3, 256 bits)
        OpenGL version string: 2.1 Mesa 9.2.1
        Not software rendered: no
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: no

    This shows no 3D support. Can anyone please guide me on how to make things work again?

    Read the article

  • Unity on Ubuntu 11.10 - The Dash Home button brings up the panel, but is empty

    - by David M. Coe
    The Dash home button brings up a panel that is greyed out, but it is totally empty. It seems to be the very same issue as this: "Dash home button brings up blank window", which is unanswered.

        $ /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: X.Org R300 Project
        OpenGL renderer string: Gallium 0.4 on ATI RV370
        OpenGL version string: 2.1 Mesa 7.11
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: yes

    I've tried unity --reset, but that doesn't seem to work: Unity seems to reset, but I get the following warning over and over:

        cs space validation failed unity

    What should I do next to try to fix this? Attempted fixes so far: I've reformatted, which did not work. I've done apt-get remove unity, then apt-get update, then apt-get install unity; that did not work either. I've switched to Unity 2D and this seems to work. How can I get regular Unity working, or at least find the error?

    Read the article

  • Graphics hardware warning when updating to 14.04

    - by pacomet
    As I use Ubuntu at work, I only update to LTS versions, but now I'm not sure if I can or should. My computer is now ten years old; I would replace it if it were mine, but as it is owned by my employer I have to work with it. It's not a bad one, and it runs fine (this was not true when it still had Windows on it ;-). When updating to 14.04, the updater warns about possibly bad or slow performance with Unity 3D, so I stopped the update, since I am at work and it's not my own computer. As I understand from http://askubuntu.com/a/438958/25305, the Nvidia GeForce FX 5500 graphics card is still supported in 14.04. Now, in 12.04, I have driver version 173, and Unity 2D runs fine for me. Output of /usr/lib/nux/unity_support_test -p:

        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: GeForce FX 5500/AGP/SSE2
        OpenGL version string: 2.1.2 NVIDIA 173.14.39
        Not software rendered: yes
        Not blacklisted: no
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: no

    Should I update? Is it better to stay with 12.04? Thanks

    Read the article

  • PyQt QDataWidgetMapper submit doesn't work

    - by DSblizzard
    Here are the relevant parts of the code (Mw - main window, Ed - edit dialog, Tv - table view):

        Mw.mapper = QDataWidgetMapper(Mw)
        Mw.mapper.setSubmitPolicy(QDataWidgetMapper.ManualSubmit)
        Mw.mapper.setItemDelegate(QSqlRelationalDelegate(Mw.mapper))
        Mw.mapper.setModel(Table.Model)
        # Id - first invisible key column
        Mw.mapper.addMapping(Ed.leName, Name)  # Name == 1
        Mw.mapper.addMapping(Ed.leSize, Size)  # Size == 2
        Mw.mapper.toFirst()
        Row = Tv.currentIndex().row()
        Mw.mapper.setCurrentIndex(Row)
        Ed.setModal(1)
        Ed.show()
        Res = Mw.mapper.submit()  # False
        Table.Model.submitAll()

    A direct SQL UPDATE statement works in this case. Why is there a problem with the mapper?

    Read the article

  • Tutorial for texture mapping a map onto an OpenGL ES sphere?

    - by hotpaw2
    I'm not looking for a library or even open-source code; I want to learn how to do this on my own. Where do I start looking for an online tutorial, a book chapter, or other educational material for generating a polygonal model of a 3D sphere suitable for feeding to OpenGL ES on an iPhone, and then mapping the polygons to some sort of 2D map data so I can texture-map the sphere? Is there some sort of software tool (Blender? Maya?) with a tutorial on how to generate this data? Where is the best place to start?
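
    A hedged sketch of the standard starting point (plain Java to match the other examples on this page; the same math ports directly to C on the iPhone): a latitude/longitude tessellation, where each vertex also gets texture coordinates u = longitude / 2π and v = latitude / π, so an equirectangular 2D map wraps onto the sphere directly:

        // Sketch: sphere vertices with equirectangular UVs, 5 floats per
        // vertex (x, y, z, u, v); triangle indices are omitted for brevity.
        static float[] sphereVertices(int stacks, int slices, float radius) {
            float[] verts = new float[(stacks + 1) * (slices + 1) * 5];
            int i = 0;
            for (int st = 0; st <= stacks; st++) {
                double phi = Math.PI * st / stacks;             // latitude, 0..pi
                for (int sl = 0; sl <= slices; sl++) {
                    double theta = 2.0 * Math.PI * sl / slices; // longitude, 0..2pi
                    verts[i++] = (float) (radius * Math.sin(phi) * Math.cos(theta)); // x
                    verts[i++] = (float) (radius * Math.cos(phi));                   // y
                    verts[i++] = (float) (radius * Math.sin(phi) * Math.sin(theta)); // z
                    verts[i++] = (float) sl / slices; // u: fraction of longitude
                    verts[i++] = (float) st / stacks; // v: fraction of latitude
                }
            }
            return verts;
        }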

    Read the article
