Search Results

Search found 8172 results on 327 pages for 'vector graphics'.


  • What is going on in this SAT/vector projection code?

    - by ssb
    I'm looking at the example XNA SAT collision code presented here: http://www.xnadevelopment.com/tutorials/rotatedrectanglecollisions/rotatedrectanglecollisions.shtml See the following code:

        private int GenerateScalar(Vector2 theRectangleCorner, Vector2 theAxis)
        {
            //Using the formula for Vector projection. Take the corner being passed in
            //and project it onto the given Axis
            float aNumerator = (theRectangleCorner.X * theAxis.X) + (theRectangleCorner.Y * theAxis.Y);
            float aDenominator = (theAxis.X * theAxis.X) + (theAxis.Y * theAxis.Y);
            float aDivisionResult = aNumerator / aDenominator;
            Vector2 aCornerProjected = new Vector2(aDivisionResult * theAxis.X, aDivisionResult * theAxis.Y);

            //Now that we have our projected Vector, calculate a scalar of that projection
            //that can be used to more easily do comparisons
            float aScalar = (theAxis.X * aCornerProjected.X) + (theAxis.Y * aCornerProjected.Y);
            return (int)aScalar;
        }

    I think the problems I'm having with this come mostly from translating physics concepts into data structures. For example, earlier in the code there is a calculation of the axes to be used. These are stored as Vector2, and they are found by subtracting one point from another; however, those points are also stored as Vector2s. So are the axes being stored as slopes in a single Vector2? Next, what exactly does the Vector2 produced by the vector projection code represent? That is, I know it represents the projected vector, but as a Vector2, what does that correspond to? A point on a line? Finally, what does the scalar at the end actually represent? It's fine to tell me that you're getting a scalar value of the projected vector, but none of the information I can find online seems to tell me about a scalar of a vector as it's used in this context. I don't see angles or magnitudes with these vectors, so I'm a little disoriented when it comes to thinking in terms of physics. If this final scalar calculation is just a dot product, how is that directly applicable to SAT from here on? Is this what I use to calculate maximum/minimum values for overlap? I guess I'm just having trouble figuring out exactly what the dot product is representing in this particular context. Clearly I'm not quite up to date on my elementary physics, but any explanations would be greatly appreciated.
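    For context, here is a minimal sketch (not from the linked tutorial) of how those per-corner scalars typically feed into the SAT overlap test; it assumes the GenerateScalar method above plus System.Math:

        // Sketch: project all four corners of each rectangle onto one axis and
        // compare the resulting [min, max] intervals; if they fail to overlap on
        // any candidate axis, the rectangles do not collide.
        private bool OverlapsOnAxis(Vector2[] cornersA, Vector2[] cornersB, Vector2 axis)
        {
            int minA = int.MaxValue, maxA = int.MinValue;
            foreach (Vector2 corner in cornersA)
            {
                int scalar = GenerateScalar(corner, axis);   // the dot-product-based value
                minA = Math.Min(minA, scalar);
                maxA = Math.Max(maxA, scalar);
            }
            int minB = int.MaxValue, maxB = int.MinValue;
            foreach (Vector2 corner in cornersB)
            {
                int scalar = GenerateScalar(corner, axis);
                minB = Math.Min(minB, scalar);
                maxB = Math.Max(maxB, scalar);
            }
            // The intervals overlap unless one lies entirely to one side of the other.
            return minB <= maxA && minA <= maxB;
        }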

    Read the article

  • Viewport.Unproject - Checking if a model intersects a large sprite

    - by Fibericon
    Let's say I have a sprite, drawn like this:

        spriteBatch.Draw(levelCannons[i].texture, levelCannons[i].position, null, alpha,
            levelCannons[i].rotation, Vector2.Zero, scale, SpriteEffects.None, 0);

    Picture levelCannon as being a laser beam that goes across the entire screen. I need to see if my 3d model intersects with the screen space inhabited by the sprite. I managed to dig up Viewport.Unproject, but that seems to only be useful when dealing with a single point in 2d space, rather than an area. What can I do in my case?
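    One option (a rough sketch only, ignoring the sprite's rotation and using placeholder names for the matrices and bounding sphere) is to go the other way: project the model's bounding sphere into screen space with Viewport.Project and test the resulting rectangle against the sprite's screen rectangle:

        // Sketch: approximate the model by its bounding sphere, project its center
        // and a point one radius away to estimate a screen-space circle, then do a
        // cheap rectangle-vs-rectangle test against the sprite.
        Vector3 centerScreen = GraphicsDevice.Viewport.Project(
            boundingSphere.Center, projection, view, Matrix.Identity);
        Vector3 edgeScreen = GraphicsDevice.Viewport.Project(
            boundingSphere.Center + Vector3.Right * boundingSphere.Radius,
            projection, view, Matrix.Identity);
        float screenRadius = Vector2.Distance(
            new Vector2(centerScreen.X, centerScreen.Y),
            new Vector2(edgeScreen.X, edgeScreen.Y));

        Rectangle modelRect = new Rectangle(
            (int)(centerScreen.X - screenRadius), (int)(centerScreen.Y - screenRadius),
            (int)(screenRadius * 2), (int)(screenRadius * 2));
        Rectangle spriteRect = new Rectangle(
            (int)levelCannons[i].position.X, (int)levelCannons[i].position.Y,
            levelCannons[i].texture.Width, levelCannons[i].texture.Height);
        bool hit = modelRect.Intersects(spriteRect);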

    Read the article

  • Drawing a texture line between two vectors in XNA WP7

    - by Krav
    I want to create a simple graph maker in WP7. The goal is to draw a textured line between two vectors that the user defines with touch. I already made the rotation, and it is working, but not correctly, because it doesn't account for the line texture's height, and because of that there are too many overlapping textures. So it does draw the line, but with far too many sprites. How could I calculate it correctly? Here is the code:

        public void DrawLine(Vector2 st, Vector2 dest, NodeUnit EdgeParent, NodeUnit EdgeChild)
        {
            float d = Vector2.Distance(st, dest);
            float rotate = (float)(Math.Atan2(st.Y - dest.Y, st.X - dest.X));
            direction = new Vector2((dest.X - st.X) / (float)d, (dest.Y - st.Y) / (float)d);
            Vector2 _pos = st;
            World.TheHive.Add(new LineHiveMind(linetexture, _pos, rotate, EdgeParent, EdgeChild, new List<LineUnit>()));
            for (int i = 0; i < d; i++)
            {
                World.TheHive.Last()._lines.Add(new LineUnit(linetexture, _pos, rotate, EdgeParent, EdgeChild));
                _pos += direction;
            }
        }

    d is the distance between st (the starting node) and dest (the destination node), rotate is the rotation, direction is the direction from the starting node to the destination node, and _pos is the position being stepped along the line. Thanks for any suggestions/help!
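    For comparison, a common alternative (a sketch using the names from the code above, not the poster's actual solution) is to draw a single sprite stretched along the edge, which sidesteps the overlap problem entirely:

        // Sketch: one draw call per edge. The X scale stretches the texture to the
        // full length of the line, and the origin centers the texture's height on it.
        float length = Vector2.Distance(st, dest);
        float angle = (float)Math.Atan2(dest.Y - st.Y, dest.X - st.X);
        spriteBatch.Draw(
            linetexture,
            st,                                        // start point
            null,                                      // use the whole texture
            Color.White,
            angle,
            new Vector2(0f, linetexture.Height / 2f),  // origin: left edge, vertically centered
            new Vector2(length / linetexture.Width, 1f),
            SpriteEffects.None,
            0f);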

    Read the article

  • Converting Degrees to X and Y Coordinate change

    - by gopgop
    I am using a float positioning system in my game, i.e. float x, y, z. Now I want to get the location of the mouse and then fire an arrow at it.

    X0 = the player's X location
    X1 = the mouse X location
    Y0 = the player's Y location
    Y1 = the mouse Y location

    I want to make a method whose parameter is an angle in degrees, and which sets my Xspeed and Yspeed so that the arrow travels from the player's x/y toward the mouse's x/y. How would I accomplish this?
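    A minimal sketch of the usual math, assuming a hypothetical scalar speed and the X0/Y0/X1/Y1 names from the question (Math.Atan2 works in radians, so degrees are converted both ways):

        // Sketch: derive the angle from player to mouse, then split a scalar speed
        // into X and Y components using the cosine and sine of that angle.
        double degrees = Math.Atan2(Y1 - Y0, X1 - X0) * 180.0 / Math.PI;
        double radians = degrees * Math.PI / 180.0;
        float speed = 5f;                              // hypothetical arrow speed
        float Xspeed = (float)(Math.Cos(radians) * speed);
        float Yspeed = (float)(Math.Sin(radians) * speed);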

    Read the article

  • Swapping axis labels between 2D and 3D coordinates

    - by Will
    My game world is 3D. The map, however, is only 2D. It is natural to think of the map as having an X and Y axis, and it is natural to think of the world as having an X, Y and Z axis, where Y is upwards. That is to say, X Y in 2D map coordinates is X Z in 3D coordinates. What conventions and approaches do you use at a code level to keep things straight and make mapping between them natural? (Is Y usually upwards in 3D? Or do you use X and Z in map coordinates, or?)
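    One common convention (a sketch, assuming Y-up world coordinates and XNA-style vector types) is to confine the swap to a single pair of helper functions so the "map Y is world Z" rule never leaks anywhere else:

        // Sketch: all map<->world conversion goes through these two helpers.
        static Vector3 MapToWorld(Vector2 map, float height)
        {
            return new Vector3(map.X, height, map.Y);   // map Y becomes world Z
        }

        static Vector2 WorldToMap(Vector3 world)
        {
            return new Vector2(world.X, world.Z);       // world Z becomes map Y
        }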

    Read the article

  • Determining if something is on the right or left side of an object?

    - by meds
    I have a character in a 3D world which is facing an arbitrary direction on a flat plane. The player can click on the left or right side of the character, and depending on which side was clicked, a different action happens. How can I determine which side the click occurred on? Obviously for facing straight ahead (0,0,1) I can simply use the x coordinate of the click point to determine whether it's the left or right hand side, but what about other cases?
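    A minimal sketch of the usual test, assuming the facing direction and click point are available as Vector3 values on the plane (the names here are placeholders):

        // Sketch: for a Y-up, right-handed coordinate system, the sign of the Y
        // component of forward x toClick says which side of the facing direction
        // the click landed on.
        Vector3 toClick = clickPoint - characterPosition;
        Vector3 cross = Vector3.Cross(characterForward, toClick);
        bool clickIsOnRight = cross.Y < 0f;   // flip the comparison for the other handedness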

    Read the article

  • How to know if a graphics card provides hardware rendering for WPF

    - by happyclicker
    I have to run a WPF app in an environment where all the machines are the same Dell PCs with an Intel GMA 3000 graphics chip (onboard, Q963/Q965). The app renders only with software rendering (stated by the RenderCapability.Tier property, which says the rendering tier is 0, and I also see this with Perforator). On all of these machines DirectX 9c is installed, and DxDiag states on many, but not all, of them that Direct3D and DirectDraw acceleration is activated. I also checked the registry in case the setup of these machines had disabled WPF hardware rendering, but that's not the case. On one machine I also updated the video driver and DirectX, with no success. I found a lot of resources saying that DirectX must be installed and active so that WPF uses DirectX hardware rendering instead of its own software renderer. But on the above machines DX9c is installed and there is still no hardware rendering. Could it be that WPF detects DirectX-capable graphics cards but talks to the graphics card directly rather than through DirectX? How can I find out whether a specific graphics chip is supposed to support hardware rendering for WPF or not? The statement that the graphics card must support DX 9c does not seem to be the only condition. The second question is: if WPF renders through DirectX, is this done through Direct3D or through DirectDraw? Is there any good documentation on this topic?
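    For reference, a small sketch of how the tier is usually read in code: RenderCapability.Tier packs the tier number into the high-order 16 bits of an int, so it has to be shifted before comparing.

        // Sketch: RenderCapability lives in System.Windows.Media (PresentationCore).
        // Tier 0 = software only, tier 1 = partial, tier 2 = full hardware acceleration.
        int renderingTier = RenderCapability.Tier >> 16;
        bool hardwareAccelerated = renderingTier > 0;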

    Read the article

  • Casting Type array to Generic array?

    - by George R
    The short version of the question - why can't I do this? I'm restricted to .NET 3.5.

        T[] genericArray;
        // Obviously T should be float!
        genericArray = new T[3] { 1.0f, 2.0f, 0.0f };
        // Can't do this either, why the hell not
        genericArray = new float[3] { 1.0f, 2.0f, 0.0f };

    Longer version - I'm working with the Unity engine here, although that's not important. What is - I'm trying to add conversion between its fixed Vector2 (2 floats) and Vector3 (3 floats) types and my generic Vector<T> class. I can't cast types directly to a generic array.

        using UnityEngine;

        public struct Vector<T>
        {
            private readonly T[] _axes;

            #region Constructors
            public Vector(int axisCount)
            {
                this._axes = new T[axisCount];
            }

            public Vector(T x, T y)
            {
                this._axes = new T[2] { x, y };
            }

            public Vector(T x, T y, T z)
            {
                this._axes = new T[3] { x, y, z };
            }

            public Vector(Vector2 vector2)
            {
                // This doesn't work
                this._axes = new T[2] { vector2.x, vector2.y };
            }

            public Vector(Vector3 vector3)
            {
                // Nor does this
                this._axes = new T[3] { vector3.x, vector3.y, vector3.z };
            }
            #endregion

            #region Properties
            public T this[int i]
            {
                get { return _axes[i]; }
                set { _axes[i] = value; }
            }

            public T X
            {
                get { return _axes[0]; }
                set { _axes[0] = value; }
            }

            public T Y
            {
                get { return _axes[1]; }
                set { _axes[1] = value; }
            }

            public T Z
            {
                get { return this._axes.Length > 2 ? _axes[2] : default(T); }
                set { _axes[2] = value; }
            }
            #endregion

            #region Operators
            public static explicit operator Vector<T>(Vector2 vector2)
            {
                Vector<T> vector = new Vector<T>(vector2);
                return vector;
            }

            public static explicit operator Vector<T>(Vector3 vector3)
            {
                Vector<T> vector = new Vector<T>(vector3);
                return vector;
            }
            #endregion
        }
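    For context, the usual workaround (a sketch, not something from the original post) is to go through object, since the compiler cannot assume T is float even when it always will be at runtime:

        // Sketch: boxing each float to object and unboxing to T compiles for any T,
        // and succeeds at runtime whenever T really is float (it throws
        // InvalidCastException for any other T).
        public Vector(Vector2 vector2)
        {
            this._axes = new T[2] { (T)(object)vector2.x, (T)(object)vector2.y };
        }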

    Read the article

  • Kernel compile error with iw_ndis.c

    - by James
    Hi, I have an HP Pavilion dm3t with Intel HD graphics running Ubuntu 10.10 64-bit. I'm trying to compile and install a patched kernel according to this: https://launchpad.net/~kamalmostafa/+archive/linux-kamal-mjgbacklight So I downloaded the tarball from here (linked to from the page above): http://kernel.ubuntu.com/git?p=kamal/ubuntu-maverick.git;a=shortlog;h=refs/heads/mjg-backlight I untar'd it to a directory, entered the directory and did:

        make defconfig

    I'm not sure if that's what I should have done, but it was successful, so I did:

        make

    which seemed to work fine until it gave these errors:

        ubuntu/ndiswrapper/iw_ndis.c:1966: error: unknown field ‘num_private’ specified in initializer
        ubuntu/ndiswrapper/iw_ndis.c:1966: warning: initialization makes pointer from integer without a cast
        ubuntu/ndiswrapper/iw_ndis.c:1967: error: unknown field ‘num_private_args’ specified in initializer
        ubuntu/ndiswrapper/iw_ndis.c:1967: warning: excess elements in struct initializer
        ubuntu/ndiswrapper/iw_ndis.c:1967: warning: (near initialization for ‘ndis_handler_def’)
        ubuntu/ndiswrapper/iw_ndis.c:1970: error: unknown field ‘private’ specified in initializer
        ubuntu/ndiswrapper/iw_ndis.c:1970: warning: initialization makes integer from pointer without a cast
        ubuntu/ndiswrapper/iw_ndis.c:1970: error: initializer element is not computable at load time
        ubuntu/ndiswrapper/iw_ndis.c:1970: error: (near initialization for ‘ndis_handler_def.num_standard’)
        ubuntu/ndiswrapper/iw_ndis.c:1971: error: unknown field ‘private_args’ specified in initializer
        ubuntu/ndiswrapper/iw_ndis.c:1971: warning: initialization from incompatible pointer type
        make[2]: *** [ubuntu/ndiswrapper/iw_ndis.o] Error 1
        make[1]: *** [ubuntu/ndiswrapper] Error 2
        make: *** [ubuntu] Error 2

    How can I compile and install this kernel successfully? I'm new to this and would appreciate any help.

    Read the article

  • Ubuntu 12.04, xbmc, opengl, intel motherboard

    - by Sean Hagen
    I've got an HTPC that I built myself, with an Asus P5G41T-M motherboard. It's got an on-board HDMI port, and I've been using that with no problems. I started out with Mythbuntu (an older version), and recently updated to 12.04.1 LTS without any issues. I've been thinking about trying out XBMC for a while, and I decided to give it a go. Unfortunately, I seem to be running into quite a few issues. I got XBMC installed from the repos without any issues, but when I try to run it from a console, a box pops up with the following:

        XBMC needs hardware accelerated OpenGL rendering. Install an appropriate graphics driver.
        Please consult XBMC Wiki for supported hardware
        http://wiki.xbmc.org/?title=Supported_hardware

    In the console, it prints out the following:

        X Error of failed request: BadRequest (invalid request code or no such operation)
        Major opcode of failed request: 136 (GLX)
        Minor opcode of failed request: 19 (X_GLXQueryServerString)
        Serial number of failed request: 12
        Current serial number in output stream: 12

    When I run vainfo, I get this:

        libva: VA-API version 0.32.0
        libva: va_getDriverName() returns 0
        libva: Trying to open /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so
        libva: va_openDriver() returns 0
        vainfo: VA-API version: 0.32 (libva 1.0.15)
        vainfo: Driver version: Intel i965 driver - 1.0.15
        vainfo: Supported profile and entrypoints
        VAProfileMPEG2Simple : VAEntrypointVLD
        VAProfileMPEG2Main : VAEntrypointVLD

    The file /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so exists:

        # ls -l /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so
        -rw-r--r-- 1 root root 628728 Mar 29 2012 /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so

    And in /var/log/Xorg.0.log the following error pops up:

        GLX error: Can not get required symbols.

    I'm not really sure where to go from here. I've tried searching all over for how to fix this problem. I've done "apt-get --reinstall xserver-xorg" (as well as a few other video driver packages) a few times, with no change. Any help in getting this issue sorted out would be awesome.

    Read the article

  • Ubuntu 12.04.2 won't boot after Bumblebee installation

    - by Andrej
    First of all, sorry for my English; it's not my first language. Here is what I have done: I had a working Ubuntu 12.04 with all updates and working Bumblebee, so I could use the optirun command and battery life was better than without Bumblebee. Then I decided to reinstall both my systems, Windows 7 and Ubuntu. I reinstalled Windows 7 and everything works as expected; then, on the other partition, I installed Ubuntu 12.04. All worked perfectly. Then I installed Bumblebee according to the procedure written here: https://wiki.ubuntu.com/Bumblebee (the same steps that I used before). But now, after I install the drivers, do everything written in the procedure and reboot my notebook, the system won't boot; it is simply stuck at a black screen after briefly showing the start screen. I have reinstalled Ubuntu many times already and tried everything, but whenever I try to install the NVIDIA drivers it won't boot after shutting down the notebook, and the only thing I can do is reinstall the system. I have a Lenovo ThinkPad Edge E530 with an Intel® Core™ i5-3210M CPU, and the graphics cards are Intel HD 4000 and NVIDIA GeForce GT 630M. After a clean install without Bumblebee, the terminal command lspci | grep VGA shows:

        00:02.0 VGA compatible controller: Intel Corporation Ivy Bridge Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation Device 0de9 (rev a1)

    Can you suggest a solution, or at least some links to similar topics?

    Read the article

  • OpenGL: Attempt to allocate a texture to big for the current hardware

    - by AnonymousMan
    I'm getting the following error:

        java.io.IOException: Attempt to allocate a texture to big for the current hardware
            at org.newdawn.slick.opengl.InternalTextureLoader.getTexture(InternalTextureLoader.java:320)
            at org.newdawn.slick.opengl.InternalTextureLoader.getTexture(InternalTextureLoader.java:254)
            at org.newdawn.slick.opengl.InternalTextureLoader.getTexture(InternalTextureLoader.java:200)
            at org.newdawn.slick.opengl.TextureLoader.getTexture(TextureLoader.java:64)
            at org.newdawn.slick.opengl.TextureLoader.getTexture(TextureLoader.java:24)

    The image I'm trying to use is 128x128. From

        System.out.println(GL11.glGetInteger(GL11.GL_MAX_TEXTURE_SIZE));

    I get: 32. 32??!! My graphics card is an AMD Radeon HD 7970M with 2048 MB GDDR5 RAM. I can run all the latest games in 1080p and 60fps with no problem, and those textures sure as hell don't look like they are 32x32 pixels to me! How can I fix this? -- Edit: Here's the chaos code I use to init OpenGL:

        Display.setDisplayMode(new DisplayMode(500, 500));
        Display.create();
        if (!GLContext.getCapabilities().OpenGL11) {
            throw new Exception("OpenGL 1.1 not supported.");
        }
        Display.setTitle("Game");
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        GLU.gluPerspective(45, 1, 0.1f, 5000);
        Mouse.setGrabbed(true);
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glEnable(GL_TEXTURE_2D);
        glClearColor(0, 0, 0, 0);
        glEnable(GL_DEPTH_TEST);
        glDepthFunc(GL_LEQUAL);
        glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        glEnable(GL_BLEND);
        glEnable(GL_POINT_SMOOTH);
        glEnable(GL_LINE_SMOOTH);
        glEnable(GL_POLYGON_SMOOTH);
        glEnable(GL_POLYGON_OFFSET_FILL);
        glShadeModel(GL_SMOOTH);

    Display is an LWJGL thing; it makes the OpenGL context and the window. Anyway, I don't think there's anything in the init code that can help me, but you never know...

    Read the article

  • How do I create a 2.5d parallax effect?

    - by Nikolay Dyankov
    I have a decent background in 3D graphics and programming, but I'm new to game development. I'm currently exploring different possibilities and I really want to make an RPG. I was thinking about a classic 2D isometric view, but I really love how Diablo 2 looks and feels to play. My question is: how can I achieve Diablo 2's parallax effect? Everything looks hand drawn with baked lights and shadows and looks awesome, but when you move around you notice some perspective. For example, let's say that I drew a big hall with columns in Photoshop in an orthographic perspective (classic pixel art style, just parallel lines). How would I give a parallax effect to this scene when the character moves around? If I use camera-facing sprites for everything it would probably look OK in the distance, but it would look really fake when a character comes close to a column (cylinder), for example. Any suggestions? How did Blizzard make the parallax effect in Diablo 2? See this screenshot: http://guidesmedia.ign.com/guides/10629/images/act2tombs.jpg
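    One way to fake that look (a sketch of a common technique, not a claim about how Diablo 2 actually renders; every name here is a placeholder) is to shear tall objects away from the screen center in proportion to their height, so columns appear to lean as the camera moves:

        // Sketch: compute how far the top of an object should be displaced from its
        // base, based on its height and its horizontal distance from the screen center.
        Vector2 screenCenter = new Vector2(screenWidth / 2f, screenHeight / 2f);
        Vector2 fromCenter = spriteScreenPosition - screenCenter;
        float parallaxStrength = 0.15f;   // hypothetical tuning constant
        Vector2 topOffset = fromCenter * parallaxStrength * (objectHeightInPixels / 100f);
        // Render the object as a quad whose top edge is shifted by topOffset while its
        // base stays at spriteScreenPosition; taller and more off-center objects lean more.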

    Read the article

  • Algorithmically generating neon layers on pixel grid

    - by user190929
    For a screensaver I am making, I am a fan of neon-like graphics, which, of course, look great against a black background. As I understand it, neon, graphically speaking, is essentially a gradient of a color: brightest in the center, getting darker proceeding outward. A more accurate model is similar, but separates it into tubes and glow. The tubes are mostly white, while the glow is where most of the color is seen. Well... the tubes could also be a light variant of the color, you could say. The glow is darker. Anyhow, my question is, how could you generate such things given an initial pattern of pixels that would be the tubes? For example, let's say I want to make a neon 'H'. I, via the libraries, can attain the rectangles of pixels which represent it, but I want to make it look neonized. How could I algorithmically achieve such an effect given a base tube shape and base color?

    EDIT: OK, I misstated that. Got a bit distracted. My purpose for this was similar to a neon effect, but not quite it. Sorry about that. What I am looking for is something like this: start with a pattern of pixels:

    [!][!][!][!][!][!][!][!]
    [!][!][O][!][!][!][!][!]
    [!][!][O][O][!][!][!][!]
    [!][!][!][!][O][!][!][!]
    [!][!][!][!][!][!][!][!]

    How do I find the E pixels?

    [!][E][E][E][!][!][!][!]
    [!][E][O][E][E][!][!][!]
    [!][E][O][O][E][E][!][!]
    [!][E][E][E][O][E][!][!]
    [!][!][!][E][E][E][!][!]

    Sorry if that looks bad.
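    A minimal sketch of one way to compute those E cells, assuming what is wanted is a one-pixel dilation of the O cells (diagonals included); the grid is represented here as a char array holding '!', 'O' and 'E':

        // Sketch: an E cell is any empty cell with at least one O neighbour
        // (8-connected). Repeating the pass with E treated like O widens the glow ring.
        static char[,] MarkEdges(char[,] grid)
        {
            int rows = grid.GetLength(0), cols = grid.GetLength(1);
            char[,] result = (char[,])grid.Clone();
            for (int r = 0; r < rows; r++)
                for (int c = 0; c < cols; c++)
                {
                    if (grid[r, c] != '!') continue;
                    for (int dr = -1; dr <= 1 && result[r, c] == '!'; dr++)
                        for (int dc = -1; dc <= 1; dc++)
                        {
                            int nr = r + dr, nc = c + dc;
                            if (nr >= 0 && nr < rows && nc >= 0 && nc < cols && grid[nr, nc] == 'O')
                            {
                                result[r, c] = 'E';
                                break;
                            }
                        }
                }
            return result;
        }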

    Read the article

  • Ubuntu won't boot after initializing the Build Environment for Android

    - by EntryLevelDev
    My laptop worked perfectly until I initialized the build environment for Android; now the GUI won't start. It looks like some kind of graphics card problem. I tried to fix it, but after trying a lot of solutions from the internet nothing worked. (I only know basic Linux stuff.) I've already reinstalled the OS. However, I still want to build Android from source. Any idea what might cause the problem? Any workaround? Here are the commands that I used to initialize the build environment:

        $ sudo apt-get install git gnupg flex bison gperf build-essential \
            zip curl libc6-dev libncurses5-dev:i386 x11proto-core-dev \
            libx11-dev:i386 libreadline6-dev:i386 libgl1-mesa-glx:i386 \
            libgl1-mesa-dev g++-multilib mingw32 tofrodos \
            python-markdown libxml2-utils xsltproc zlib1g-dev:i386
        $ sudo ln -s /usr/lib/i386-linux-gnu/mesa/libGL.so.1 /usr/lib/i386-linux-gnu/libGL.so

    My laptop model is the Asus U36SD (https://help.ubuntu.com/community/Asus_U36SD). Thanks.

    Edit: Based on this, I guess libgl1-mesa-glx:i386 might cause the issue:

        sudo apt-get install libgl1-mesa-dri:i386
        The following packages will be REMOVED:
          libgl1-mesa-dri-lts-quantal libxatracker1-lts-quantal ubuntu-desktop xorg
          xserver-xorg-lts-quantal xserver-xorg-video-all-lts-quantal
          xserver-xorg-video-vmware-lts-quantal
        The following NEW packages will be installed:
          libdrm-intel1:i386 libdrm-nouveau1a:i386 libdrm-radeon1:i386 libdrm2:i386
          libexpat1:i386 libffi6:i386 libgl1-mesa-dri:i386 libllvm3.0:i386
          libpciaccess0:i386 libstdc++6:i386

    Read the article

  • Drawing particles as a smooth blob

    - by Nömmik
    I'm new to game/graphics development and I'm playing around with particles (in 2D). I want to draw particles close to each other as a blob, just as liquid/water. I do not want to draw big overlapping circles, as the blob won't be smooth (and would be too big). I don't really know physics, but I assume what I want is something that looks similar to surface tension. I haven't been able to find anything on Stack Exchange or on Google (maybe I don't know the correct keywords?). So far I have found two possible solutions, but I am unable to find any concrete information about algorithms. One of them is to calculate the concave hull of the particles I consider to be a blob. I can calculate the blob by creating an equivalence class (on the relation "close to each other"). Strangely enough, I haven't been able to find any algorithm explaining how to calculate the concave hull. Many posts (including on Stack Exchange) link to libraries or commercial products that do this (I need libraries that work in C#), but never an algorithm. Also, this solution might have a problem with a ring of particles: it would not detect the empty space in the middle. While researching concave hulls I stumbled upon something called alpha shapes, which seem to be exactly what I want, but just as with the concave hull I haven't found any source explaining how they actually work. I have found some presentation materials but not enough to go on. It's like a big secret everyone knows except me :-/ After calculating the concave hull or alpha shape I want to turn it into a Bézier curve to make it smooth and nice. Although, I do find my approach a bit too complex; maybe I am trying to solve this the wrong way? If you can either suggest any other solution to my problem, or explain the pieces I am missing, I would be very happy and grateful :-) Thanks.
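    For what it's worth, a common alternative to hulls and alpha shapes is the metaball/threshold approach; here is a minimal CPU-side sketch (grid size, radius and threshold are made-up values, and Vector2 stands in for whatever 2D vector type is at hand):

        // Sketch: accumulate a radial falloff from every particle into a scalar field,
        // then treat each cell above a threshold as "inside" the blob. Rendering the
        // mask (or tracing its contour, e.g. with marching squares) gives a smooth,
        // surface-tension-like outline without computing a hull.
        static bool[,] ComputeBlobMask(IEnumerable<Vector2> particles, int width, int height)
        {
            const float radius = 20f;     // influence radius in cells (made up)
            const float threshold = 4f;   // "inside the blob" cutoff (made up)
            float[,] field = new float[width, height];
            foreach (Vector2 p in particles)
            {
                int minX = Math.Max(0, (int)(p.X - radius)), maxX = Math.Min(width - 1, (int)(p.X + radius));
                int minY = Math.Max(0, (int)(p.Y - radius)), maxY = Math.Min(height - 1, (int)(p.Y + radius));
                for (int y = minY; y <= maxY; y++)
                    for (int x = minX; x <= maxX; x++)
                    {
                        float dx = x - p.X, dy = y - p.Y;
                        field[x, y] += (radius * radius) / (dx * dx + dy * dy + 1f);  // radial falloff
                    }
            }
            bool[,] inside = new bool[width, height];
            for (int y = 0; y < height; y++)
                for (int x = 0; x < width; x++)
                    inside[x, y] = field[x, y] > threshold;
            return inside;
        }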

    Read the article

  • Making a 2D game with responsive resolution

    - by alexandervrs
    I am making a 2D game, and I wish for it to be resolution agnostic. My target resolution, i.e. where things look as intended, is 1600 x 900. My ideas are:

    Make the HUD stay fixed to the sides no matter what the resolution; use one size for HUD graphics under a certain resolution and another above a certain large one.
    Use large HD PNG sprites/backgrounds whose dimensions are a power of 2, so they scale nicely. No vectors.
    Use the player's native resolution.
    Scale the game area (not the HUD) to fit (resulting in some zooming in, and cropping the sides of the game area if necessary for widescreen, with no stretching), but always fill the screen.
    Have a minimum and maximum resolution limit for small and very large displays, where you just change the resolution(?) or scale up/down to fit.

    What I am a bit confused about, though, is what math formula I would use to scale the game area correctly based on the resolution, no matter the aspect ratio: fully fit on a square screen, and clip the sides somewhat for widescreen. Pseudocode would help as well. :)
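    A minimal sketch of the usual fill-and-crop math, assuming the 1600 x 900 design resolution above (screenWidth and screenHeight are placeholders for the player's actual resolution):

        // Sketch: scale uniformly so the 1600x900 design area always covers the screen,
        // cropping whichever dimension overflows; centering keeps the crop symmetric.
        const float designWidth = 1600f, designHeight = 900f;
        float scale = Math.Max(screenWidth / designWidth, screenHeight / designHeight);
        float scaledWidth = designWidth * scale;
        float scaledHeight = designHeight * scale;
        float offsetX = (screenWidth - scaledWidth) / 2f;   // negative when the width is cropped
        float offsetY = (screenHeight - scaledHeight) / 2f; // negative when the height is cropped
        // Draw the game area scaled by 'scale' and translated by (offsetX, offsetY);
        // swapping Math.Max for Math.Min gives letterboxing instead of cropping.
        // The HUD is drawn separately, anchored to the real screen edges.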

    Read the article

  • ASUS EPC 1015CX boots with no display after installing additional drivers

    - by Dinesh
    I have an ASUS EPC 1015CX with Ubuntu 12.04 LTS installed using Wubi alongside Windows 7. It worked perfectly after installing, except that the supported screen resolution was only 800x600 instead of 1024x600. The specifications of the 1015CX device can be found here. I tried installing the additional drivers, which were Broadcom for wireless and Cedarview-drm. After this there was no display coming up after boot. I could hear a voice, and the letters I typed got pronounced correctly, so I guessed the issue has to be with the graphical display. I tried "repair broken packages" in the menu and booted, and the display is now at 1024x600. But if I restart normally, the problem continues as before. Based on what I found while searching, I again opened Additional Drivers and installed the Cedarview graphics driver. I tried restarting, hoping it would work, but again the result was the same: no display, although the system was running. I did more searching and followed the instructions found here, but the result was still the same. I am keen to switch to Linux from Windows after hearing and learning great things about it. I know this community is good and always helpful, and I want to thank you all for the help in advance. Please let me know what I am doing wrong and help me switch to Linux from this same old Windows.

    Read the article

  • No GUI boot; startx error, I suspect no filesystem corruption.

    - by Dharmaj Soni
    Until yesterday, my Ubuntu 9.10 was working fine. I had watched a movie using VLC, and I had also charged my iPod using the laptop. Today, when I started it, I was booted straight into the command line. There seems to be no filesystem corruption, as I can view/open (text) files. Before the CLI appeared, the screen blinked with a cursor, then the white Ubuntu logo flashed, and then I got the CLI login prompt. After logging in, if I try startx to start GNOME, I get the following error after a few seconds:

        giving up
        xinit: No such file or directory (errno 2): unable to connect to X server
        xinit: No such process (errno 3): Server error

    The same error comes up even if I use sudo, or if I change my directory to '/' before using startx, and also when I use the recovery mode option from GRUB to boot into the CLI and then try startx. On trying the command 'xinit', I get "Server error". Also, on trying GDM, I get 2 errors. I cannot connect to the internet in this state. Thanks for any help. I am using a Dell Inspiron 1440, with no special graphics card.

    Read the article

  • Unity does not load when selecting Xorg open source radeon driver

    - by Teddy Thorpe
    When I select the X.org Radeon open source driver, Unity does not load. However, when I select the proprietary AMD driver from the official Ubuntu repository, Unity does load. Why is Unity doing this? I had no problems switching between these drivers in Ubuntu 12.04 LTS; this started happening when I upgraded to Ubuntu 12.10. Before I ever installed the proprietary AMD driver, the X.org driver loaded Unity fine, but very sluggishly. I downloaded the AMD driver from their official website and installed that first, and that is what started this problem. I removed that driver and went back to X.org, and X.org didn't load Unity either. Then, using Synaptic, I installed the AMD driver from the Ubuntu repositories, and that's what I'm using now. I'm very confused why that downloaded AMD driver would affect X.org. I have an AMD Radeon HD 6620G. It's integrated graphics on my AMD A8-3500M APU (Accelerated Processing Unit, as AMD advertises it).

    Read the article

  • Compiling kernel problem

    - by James
    Hi, I have an HP Pavilion dm3t with Intel HD graphics running Ubuntu 10.10 64-bit. I'm trying to compile and install a patched kernel according to this: https://launchpad.net/~kamalmostafa/+archive/linux-kamal-mjgbacklight So I downloaded the tarball from here (linked to from the page above): http://kernel.ubuntu.com/git?p=kamal/ubuntu-maverick.git;a=shortlog;h=refs/heads/mjg-backlight I untar'd it to a directory, entered the directory and did:

        make defconfig

    which was successful, so I did:

        make

    which seemed to work fine until it gave these errors:

        ubuntu/ndiswrapper/iw_ndis.c:1966: error: unknown field ‘num_private’ specified in initializer
        ubuntu/ndiswrapper/iw_ndis.c:1966: warning: initialization makes pointer from integer without a cast
        ubuntu/ndiswrapper/iw_ndis.c:1967: error: unknown field ‘num_private_args’ specified in initializer
        ubuntu/ndiswrapper/iw_ndis.c:1967: warning: excess elements in struct initializer
        ubuntu/ndiswrapper/iw_ndis.c:1967: warning: (near initialization for ‘ndis_handler_def’)
        ubuntu/ndiswrapper/iw_ndis.c:1970: error: unknown field ‘private’ specified in initializer
        ubuntu/ndiswrapper/iw_ndis.c:1970: warning: initialization makes integer from pointer without a cast
        ubuntu/ndiswrapper/iw_ndis.c:1970: error: initializer element is not computable at load time
        ubuntu/ndiswrapper/iw_ndis.c:1970: error: (near initialization for ‘ndis_handler_def.num_standard’)
        ubuntu/ndiswrapper/iw_ndis.c:1971: error: unknown field ‘private_args’ specified in initializer
        ubuntu/ndiswrapper/iw_ndis.c:1971: warning: initialization from incompatible pointer type
        make[2]: *** [ubuntu/ndiswrapper/iw_ndis.o] Error 1
        make[1]: *** [ubuntu/ndiswrapper] Error 2
        make: *** [ubuntu] Error 2

    How can I compile and install this kernel successfully? I'm new to this and would appreciate any help.

    Read the article

  • Ubuntu Gnome 14.04: Install Nvidia drivers

    - by user3524668
    I have been looking all over the web and tried every suggestion I found to get the NVIDIA drivers working on my computer with Ubuntu GNOME 14.04, with no luck. Every time I install a driver, or choose to use the driver from Additional Drivers, I cannot log in again when I reboot; it gets stuck at the logo screen. I need to go to a console with Ctrl+Alt+F1 to purge all NVIDIA traces so I can get back in. Is it possible to install the new NVIDIA drivers? I just upgraded from 13.10 to 14.04. I have an Asus N550VJ, which has hybrid graphics with Intel 4000 / NVIDIA 750M. What I'm looking for is to try the primus functionality to disable the discrete card and enable it whenever I want to play games or run heavy graphics work. When I was on 13.10 I was using Bumblebee, but since NVIDIA Prime is supposedly mature enough now, I wanted to try it. Is this possible on Ubuntu GNOME 14.04? I read there was a bug with GDM, but also saw that it was already fixed. Thank you very much for your help. I'm not that well versed in Linux.

    Read the article

  • Making a game with responsive resolution

    - by alexandervrs
    I am making a game, and I wish for it to be resolution agnostic. My target resolution, i.e. where things look as intended, is 1600 x 900. My ideas are:

    Make the HUD stay fixed to the sides no matter what the resolution; use one size for HUD graphics under a certain resolution and another above a certain large one.
    Use large HD sprites/backgrounds whose dimensions are a power of 2, so they scale nicely.
    Use the player's native resolution.
    Scale the game area (not the HUD) to fit (resulting in some zooming in, and cropping the sides of the game area if necessary for widescreen, with no stretching), but always fill the screen.
    Have a minimum and maximum resolution limit for small and very large displays, where you just change the resolution(?) or scale up/down to fit.

    What I am a bit confused about, though, is what math formula I would use to scale the game area correctly based on the resolution, no matter the aspect ratio: fully fit on a square screen, and clip the sides somewhat for widescreen. Pseudocode would help as well. :)

    Read the article

  • Intel graphics chipset and NVIDIA GeForce GTX 560

    - by antoine
    I have an NVIDIA GeForce GTX 560 with two video projectors, and I would like to use the onboard Intel graphics chipset to plug in an additional monitor. I saw the question "How can I use both Intel onboard and Nvidia graphics at the same time?", but the answer is so short that I was not convinced. My motherboard (GIGABYTE GA-H61M-D2P-B3, rev. 1.0), equipped with the Intel H61 chipset, allows shared memory between onboard and PCIe cards, and Windows 7 lets me use the three outputs thanks to Intel's driver. I'm able to use the onboard graphics card, but without a graphical interface for now; I think I need the Intel driver for that. I would like to know if I can set up my displays in xorg.conf with something like:

        Section "Device"
            Identifier "Device0"
            Driver "intel"
        EndSection

        Section "Device"
            Identifier "Device1"
            Driver "nvidia"
        EndSection

        Section "Device"
            Identifier "Device2"
            Driver "nvidia"
        EndSection

    Has anyone successfully set up something like that? Or should I burn my head experimenting with it by myself? Or are there any good reasons to discourage me from trying? Thanks for your help. Antoine

    PS: I'm using Ubuntu 10.10 for now, but I could switch to another version.
    PS2: I also read this: Use 3 monitors w/built-in intel adapter + two old nvidia PCI cards on 10.10?, which doesn't tell me more about the possibility of using Intel graphics and Nvidia at the same time.

    EDIT: According to that: Can not get Dual Monitors to work on Different GPUs, I should be able to run two X servers, one on Intel and the other on Nvidia. I will try and post the result here.

    Read the article

  • How to get Unity working on dual gpu laptop?

    - by Mourgos
    I recently bought a new laptop (Asus X53S series) that has two GPUs, an NVIDIA GeForce GT 540M and an integrated Intel GPU (I believe it's called Intel HD Graphics 3000). I installed the recommended restricted NVIDIA drivers after a clean Ubuntu 11.10 install. In the 'Additional Drivers' program I get the message "This driver is enabled and in use", although when I try to open the NVIDIA X Server Settings it says "You do not appear to be using the NVIDIA X driver", which seems to be the case, since Ubuntu only starts Unity 2D. I had the same issue in 11.04 and was forced to use the nouveau driver just to get Unity working, but since I get quite a few crashes with it I really want to get the proprietary driver working this time around. Since I've never had this issue with older laptops, I can only assume it is caused by the dual-GPU configuration. How do I get Ubuntu to use the proprietary drivers, or is there any workaround to get the integrated Intel GPU to be ignored by Ubuntu? Alternatively, has anyone got Unity working with a similar setup?

    Read the article
