Search Results

Search found 6703 results on 269 pages for 'amd graphics'.

  • How can I replicate the look and limitations of the Super NES?

    - by Mikalichov
    I am looking to produce graphics with the same limitations and look as in the Super NES era; I am specifically aiming for graphics similar to Chrono Trigger / FF6. It would be a lot easier if I knew what resolution / dpi I am supposed to use. I found that the technical specs for the SNES are: Progressive: 256 × 224, 512 × 224, 256 × 239, 512 × 239; Interlaced: 512 × 448, 512 × 478. But even using these resolutions is pointless if I work at 72 dpi, as I could still end up with very detailed graphics (and that is the main point: I don't want detailed graphics, I want them pixelated). I figured it might be related to the sprite size limit, i.e.: sprites can be 8 × 8, 16 × 16, 32 × 32, or 64 × 64 pixels, each using one of eight 16-color palettes and tiles from one of two blocks of 256 in VRAM; up to 32 sprites and 34 8 × 8 sprite tiles may appear on any one line. This would work for sprites (characters, objects), but what about maps? Are they built entirely from 8 × 8 tiles? And at what resolution is the end result displayed? It might seem like I am asking the question and answering it at the same time, but all of these are suppositions on my part, so could someone confirm or correct them?
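
    As a sanity check on the tile math, a quick sketch (the figures assume the common 256 × 224 mode; SNES background layers are tilemaps built from 8 × 8 tiles, and dpi is irrelevant for screen art, so draw at 1:1 pixel scale and upscale with nearest-neighbor for display):

        # a 256x224 screen divided into 8x8 background tiles:
        echo "$((256 / 8)) x $((224 / 8)) tiles"   # -> 32 x 28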

    Read the article

  • Kubuntu never gets to the login screen

    - by searchfgold6789
    I am not sure if Xorg is using the right device... The system in question is an A4-1400 CPU with built-in Radeon graphics. I added a PCIe Radeon HD 5750 and changed the BIOS so it would use that card automatically. Now I can see output on the TV plugged into the HDMI port, and I can boot into recovery mode, but I cannot get to a desktop: X doesn't start; it just freezes at the Kubuntu splash screen. I have no idea how to get X to use the proper graphics card. I pasted the Xorg.log to http://paste.ubuntu.com/6261137/, and the Xorg.conf generated by aticonfig --initial is at http://paste.ubuntu.com/6261161/. Also, see the lspci and lshw -c display outputs. I can provide more information, but something tells me X is not using the right graphics card, and I have no idea where to begin fixing it... I faced a similar issue before with another distribution but found no information anywhere on how to make X use a specific graphics card. Or maybe it's not that simple.
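
    One standard way to pin X to a specific card is a BusID entry in /etc/X11/xorg.conf; a minimal sketch (the bus address below is an example: take the real one from lspci, converted to decimal):

        # lspci | grep VGA   -> e.g. "01:00.0 VGA compatible controller: ... HD 5750"
        Section "Device"
            Identifier "Radeon HD 5750"
            Driver     "radeon"        # or "fglrx" for the proprietary driver
            BusID      "PCI:1:0:0"     # lspci's 01:00.0, written in decimal
        EndSection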

    Read the article

  • Day 5 - Tada! My Game Menu Screen Graphics

    - by dapostolov
    So, tonight I took some time to mash up some graphics for my game menu screen. My artistic talent sucks... but here goes nothing... voila, my menu screen!!

    The Menu Screen

    The screen above is displaying 4 sprites, even though it looks like maybe 7... I guess one of the first things for me to test in the future is... is it more memory-efficient (and better for the frame rate) to draw one big background image, or to paint the screen black and place each sprite in set locations? To display the 4 sprites above, I borrowed my code from yesterday... I know, tacky, but... I wanted to see it, feel it. Do you feel it? FEEL IT! (homer voice & shakes fist) Note: the menu items won't scale properly as it stands with this code; well, pretty much they do nothing except look pretty...

    Paint.NET & Google Fun

    So how did I create that image above? Well, to create the background and 3 menu items I used Paint.NET. Basically, I scoured Google Images for: a stone doorway, a stone pillar, an old book, a wizard's hat, and... that's pretty much it! I'll let you type in those searches and see if you can locate the images I used. I know, bad developer... but I figured since I modified the images considerably it doesn't count... well, for a personal project it shouldn't count... *shrug* Anyhow, I extracted each key asset I wanted from each image and applied lots of matting, blurring, color changes, glow effects and such. Then, using my vivid imagination, I placed / composed the layered assets into the mashed-up "scene" above. Pretty cool, eh? Hey, did you know the cool mist effect is actually a fire rendition in Paint.NET? I set it to black & white with opacity set next to nothing. I'm also very proud of the yellow "light" in the stone doorway. I drew that in and then applied a Gaussian blur to give it the effect of light creeping out around the door and into the room... heheh. So did I achieve the dark, mysterious ritual feel I stated in my design doc? I think I had a great stab at it! Maybe down the road I can get a real artist to crank out some quality graphics for the game... =)

    So, What's Next?

    Well, I don't have that animated brazier yet... however, I thought it would be even cooler if I could get that door pulsing with that yellow light, and it would be extremely cool to have the smoke / mist moving across the screen! Make the creative ideas stop!! (clutches head) haha! I'm having great fun working on this project =) I recommend others give something like this a try; it's really fulfilling. OK. Tomorrow... I think I'm going to start creating some game / menu objects as per the design doc, maybe even get a custom mouse cursor up on the screen and handle a couple of mouse events, and lastly, maybe a feature to toggle a frame-rate display... D.

    Read the article

  • Acer Aspire One 725 - missing graphics card driver for Radeon HD 6290?

    - by Melon
    Recently I bought an Acer Aspire One 725 netbook and installed Ubuntu 12.10 on it. I bought it because it can play HD movies and supports Full HD on the external VGA port. However, movies from YouTube have a really slow frame rate, and if you open three tabs in Opera (for example Gmail, YouTube and Ask Ubuntu) it gets really laggy. My suspicion is that the graphics card driver is missing: under System -> Details -> Graphics the driver shows as unknown. After running lspci | grep VGA I get this output:

        00:01.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Device 980a

    From what I can tell, I have an AMD C-70 processor with integrated AMD Radeon HD 6290 graphics (or something similar). Has anyone had the same problem? Do you know which drivers need to be installed for the graphics to work properly? On the official Acer page there are only drivers for Win7 and Win8... Update: I have tried installing fglrx, but I get the following errors (either I am missing libraries or someone didn't do a clean build before the release ;)

        /lib/modules/fglrx/build_mod/2.6.x/firegl_public.c: In function ‘KCL_MEM_AllocLinearAddrInterval’:
        /lib/modules/fglrx/build_mod/2.6.x/firegl_public.c:2124:5: error: implicit declaration of function ‘do_mmap’ [-Werror=implicit-function-declaration]
        /lib/modules/fglrx/build_mod/2.6.x/firegl_public.c:2124:13: warning: cast to pointer from integer of different size [-Wint-to-pointer-cast]
        /lib/modules/fglrx/build_mod/2.6.x/firegl_public.c: In function ‘kasInitExecutionLevels’:
        /lib/modules/fglrx/build_mod/2.6.x/firegl_public.c:4159:5: error: ‘cpu_possible_map’ undeclared (first use in this function)
        /lib/modules/fglrx/build_mod/2.6.x/firegl_public.c:4159:5: note: each undeclared identifier is reported only once for each function it appears in
        /lib/modules/fglrx/build_mod/2.6.x/firegl_public.c:4159:5: warning: left-hand operand of comma expression has no effect [-Wunused-value]

    Update 2: After fixing the compilation errors, Ubuntu acts bizarre and unstable (no left icon panel, no top panel, I cannot run any programs; I only see the desktop).
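
    The do_mmap and cpu_possible_map errors suggest this fglrx release predates the 12.10 kernel's API; a commonly suggested cleanup before falling back to the open-source driver (a sketch; the package names are the usual ones for 12.10, so treat them as assumptions):

        sudo apt-get remove --purge fglrx*
        sudo apt-get install --reinstall xserver-xorg-video-ati libgl1-mesa-glx libgl1-mesa-dri
        sudo dpkg-reconfigure xserver-xorg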

    Read the article

  • Low graphics mode after switching to fglrx drivers

    - by MrKenkadze27
    I have a new problem on my laptop after trying to fix another issue. Because of this issue, I wanted to switch to fglrx, and after a restart I got this screen: I went back to the terminal and got rid of that problem by purging and removing the fglrx driver, which put me back at problem number one. I tried a lot of methods to fix this, such as these ones, but they either switched back to the open-source drivers or didn't help at all. So, would anyone like to help? Maybe suggest some commands to try? My laptop has two GPUs, an AMD Radeon HD 7650M and Intel(R) HD Graphics 4000; the AMD card is always running, making my laptop too hot. Here's a paste of the Xorg.5.log file; I am sure it will be useful in finding my problem. Thanks! Please make the answer easy to understand, as I am not an expert (this problem is keeping me from becoming one). Also, the AMD driver that can be downloaded from their site doesn't install (it reports a non-compatible graphics card), yet Ubuntu Software Updater installs it without complaint.
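
    For the heat problem specifically, with the open-source radeon driver in use (not fglrx), the idle discrete GPU can be powered down through vgaswitcheroo; a sketch, assuming debugfs is mounted at the default path:

        sudo cat /sys/kernel/debug/vgaswitcheroo/switch                # lists IGD (Intel) / DIS (AMD)
        echo OFF | sudo tee /sys/kernel/debug/vgaswitcheroo/switch     # power off the inactive GPU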

    Read the article

  • Rendering of graphics different depending on DisplayObject position

    - by jedierikb
    When drawing vertical lines with a non-integer x-value (e.g., 1.75) to a sprite, the lines are rendered differently depending on the non-integer x-value of the sprite itself. In the picture below are two pairs of very-close-together vertical lines, and as you can see, they look very different. This is frustrating, especially when animating the sprite. Any ideas how to ensure that the graphics of sprites with non-integer positions display the same way regardless of the sprite's position?

        package {
            import flash.display.Sprite;
            import flash.display.StageAlign;
            import flash.display.StageScaleMode;
            import flash.events.Event;

            public class tmp extends Sprite {
                private var _sp1:Sprite;
                private var _sp2:Sprite;

                public function tmp():void {
                    stage.align = StageAlign.TOP_LEFT;
                    stage.scaleMode = StageScaleMode.NO_SCALE;
                    _sp1 = new Sprite();
                    drawButt(_sp1);
                    _sp1.x = 100;
                    _sp1.y = 100;
                    _sp2 = new Sprite();
                    drawButt(_sp2);
                    _sp2.x = 100;
                    _sp2.y = 200;
                    addChild(_sp1);
                    addChild(_sp2);
                    addEventListener(Event.ENTER_FRAME, efCb, false, 0, true);
                }

                // nudge _sp2 right by 0.1 px per frame, wrapping back to x = 100
                private function efCb(evt:Event):void {
                    var nx:Number = _sp2.x + .1;
                    if (nx > 400) {
                        nx = 100;
                    }
                    _sp2.x = nx;
                }

                // one line at an integer x (1), one at a fractional x (1.75)
                private function drawButt(sp:Sprite):void {
                    sp.graphics.clear();
                    sp.graphics.lineStyle(1, 0, 1, true);
                    sp.graphics.moveTo(1, 1);
                    sp.graphics.lineTo(1, 100);
                    sp.graphics.lineStyle(1, 0, 1, true);
                    sp.graphics.moveTo(1.75, 1);
                    sp.graphics.lineTo(1.75, 100);
                }
            }
        }

    Read the article

  • Why does paintComponent in Java run every time I use its Graphics object?

    - by Pooya
    Consider this code:

        public class StateChartPanel extends JPanel {
            private LightContext LC;

            public StateChartPanel(LightContext lc) {
                LC = lc;
            }

            public void paintComponent(Graphics G) {
                super.paintComponent(G);
                LC.DrawStateChart((Graphics2D) G);
            }
        }

    StateChartPanel is a panel for drawing something (a state chart). It passes its Graphics object to LC, which uses it to draw shapes, but whenever LC draws something, paintComponent of StateChartPanel runs again, and this causes my application to hang.

    Read the article

  • Ubuntu 12.10 - Graphics driver shows Gallium 0.4 instead of Intel HD

    - by nlooije
    I have a hybrid Intel/Nvidia system using Bumblebee on Ubuntu 12.10, with specs: Clevo W150HR, SB i7-2720QM, 8GB RAM, 128GB Crucial M4 SSD + 500GB HDD. It reports software rendering (through the Gallium driver) rather than acceleration through the Intel HD driver. Reinstalling the Intel driver has no effect:

        sudo apt-get install --reinstall xserver-xorg-video-intel

    The result is a rather sluggish desktop. Is there any way to blacklist the Gallium driver? Or to force the Intel driver instead? Edit 1 - Ubuntu 12.04 shows the driver as Intel SandyBridge out of the box, but 12.10 does not, even after running the command above. Edit 2 - Xorg.0.log shows:

        (--) intel(0): Integrated Graphics Chipset: Intel(R) Sandybridge Mobile (GT1)
        (WW) intel(0): Disabling hardware acceleration on this pre-production hardware.

    Edit 3 - It turns out that SandyBridge rev. 7 systems are known to be unstable, per this link. Accordingly, xserver-xorg-video-intel disables hardware acceleration when it detects this revision, producing the warning in the log above.
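
    To confirm which renderer Mesa is actually loading (a quick check; glxinfo ships in the mesa-utils package):

        sudo apt-get install mesa-utils
        glxinfo | grep -i "opengl renderer"
        # "Gallium 0.4 on llvmpipe" means software rendering;
        # an "Intel(R) Sandybridge ..." string means hardware acceleration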

    Read the article

  • Often "running in low-graphics mode” with NVidia GeForce GT 430

    - by user42228
    Often (every fifth time, perhaps) when I turn on my system, I am greeted with the "running in low-graphics mode" window. Sometimes all I have to do is reboot and the system will start correctly; sometimes there is a longer series of failed starts. If I go to the terminal and log in, I can run startx and my desktop will appear without problems. At the end of /var/log/Xorg.1.log I see this:

        [ 77.459] (EE) Screen(s) found, but none have a usable configuration.
        [ 77.459] Fatal server error:
        [ 77.459] no screens found
        [ 77.459] Please consult the The X.Org Foundation support at http://wiki.x.org

    Is the Nvidia driver creating the appropriate X.Org configuration dynamically on every boot, or what else could explain why the configuration error above occurs only sometimes?
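
    One thing worth trying is generating a static configuration, so X does not depend on per-boot detection; a sketch (nvidia-xconfig ships with the proprietary driver):

        sudo nvidia-xconfig       # writes /etc/X11/xorg.conf for the NVIDIA driver
        cat /etc/X11/xorg.conf    # inspect the generated Device/Screen sections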

    Read the article

  • iPhone image asset recommended resolution/dpi/format

    - by Matthew
    I'm learning iPhone development, and a friend will be doing the graphics/animation. I'll most likely be using cocos2d (if that matters). My friend wants to get started on the graphics, and I don't know what image resolution, dpi, or formats are recommended. This probably depends on whether something is a background or a small character. Also, I know I read something about using @2x in image file names to support high-res iPhone screens. Does cocos2d prefer a different way? Or is this not something to worry about at this point? What should I know before they start working on the graphics?
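
    On the naming convention specifically: UIKit picks the double-resolution variant automatically when both files ship, while older cocos2d releases used an -hd suffix for the same idea, so check which convention your cocos2d version expects. A sketch of the pairing (filenames and sizes are illustrative):

        player.png       # standard-resolution art, e.g. 32x32 pixels
        player@2x.png    # the same art at exactly double size, 64x64, for Retina displays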

    Read the article

  • Basic shadow mapping fails on NVIDIA card?

    - by James
    Recently I switched from an AMD Radeon HD 6870 card to an (MSI) NVIDIA GTX 670 for performance reasons. I found, however, that my implementation of shadow mapping failed in all my applications. In a very simple shadow POC project, the problem appears to be that drawing the scene never results in a draw to the depth map, so the entire depth map is just infinity, 1.0. (Reading directly from the depth component after the draw (glReadPixels) shows every pixel is infinity (1.0); replacing the depth comparison in the shader with a comparison of the shadow-map depth against 1.0 shadows the entire scene; and writing random values to the depth map and then not calling glClear(GL_DEPTH_BUFFER_BIT) results in a random noisy pattern on the scene elements, from which we can infer that the upload of the depth texture and the comparison within the shader are functioning perfectly.) Since the problem almost certainly lies in the depth render, this is the code for that:

        const int s_res = 1024;
        GLuint shadowMap_tex;
        GLuint shadowMap_prog;
        GLint sm_attr_coord3d;
        GLint sm_uniform_mvp;
        GLuint fbo_handle;
        GLuint renderBuffer;
        bool isMappingShad = false;

        //The scene consists of a plane with a box above it
        GLfloat scene[] = {
            -10.0, 0.0, -10.0, 0.5, 0.0,
             10.0, 0.0, -10.0, 1.0, 0.0,
             10.0, 0.0,  10.0, 1.0, 0.5,
            -10.0, 0.0, -10.0, 0.5, 0.0,
            -10.0, 0.0,  10.0, 0.5, 0.5,
             10.0, 0.0,  10.0, 1.0, 0.5,
            ...
        };

        //Initialize the stuff used by the shadow map generator
        int initShadowMap() {
            //Initialize the shadowMap shader program
            if (create_program("shadow.v.glsl", "shadow.f.glsl", shadowMap_prog) != 1)
                return -1;
            const char* attribute_name = "coord3d";
            sm_attr_coord3d = glGetAttribLocation(shadowMap_prog, attribute_name);
            if (sm_attr_coord3d == -1) {
                fprintf(stderr, "Could not bind attribute %s\n", attribute_name);
                return 0;
            }
            const char* uniform_name = "mvp";
            sm_uniform_mvp = glGetUniformLocation(shadowMap_prog, uniform_name);
            if (sm_uniform_mvp == -1) {
                fprintf(stderr, "Could not bind uniform %s\n", uniform_name);
                return 0;
            }
            //Create a framebuffer
            glGenFramebuffers(1, &fbo_handle);
            glBindFramebuffer(GL_FRAMEBUFFER, fbo_handle);
            //Create render buffer
            glGenRenderbuffers(1, &renderBuffer);
            glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
            //Setup the shadow texture
            glGenTextures(1, &shadowMap_tex);
            glBindTexture(GL_TEXTURE_2D, shadowMap_tex);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, s_res, s_res, 0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
            return 0;
        }

        //Delete stuff
        void dnitShadowMap() {
            glDeleteFramebuffers(1, &fbo_handle);
            glDeleteRenderbuffers(1, &renderBuffer);
            glDeleteTextures(1, &shadowMap_tex);
            glDeleteProgram(shadowMap_prog);
        }

        int loadSMap() {
            //Bind MVP stuff
            glm::mat4 view = glm::lookAt(glm::vec3(10.0, 10.0, 5.0), glm::vec3(0.0, 0.0, 0.0), glm::vec3(0.0, 1.0, 0.0));
            glm::mat4 projection = glm::ortho<float>(-10, 10, -8, 8, -10, 40);
            glm::mat4 mvp = projection * view;
            glm::mat4 biasMatrix(
                0.5, 0.0, 0.0, 0.0,
                0.0, 0.5, 0.0, 0.0,
                0.0, 0.0, 0.5, 0.0,
                0.5, 0.5, 0.5, 1.0
            );
            glm::mat4 lsMVP = biasMatrix * mvp;
            //Upload light source matrix to the main shader programs
            glUniformMatrix4fv(uniform_ls_mvp, 1, GL_FALSE, glm::value_ptr(lsMVP));
            glUseProgram(shadowMap_prog);
            glUniformMatrix4fv(sm_uniform_mvp, 1, GL_FALSE, glm::value_ptr(mvp));
            //Draw to the framebuffer (with depth buffer only draw)
            glBindFramebuffer(GL_FRAMEBUFFER, fbo_handle);
            glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
            glBindTexture(GL_TEXTURE_2D, shadowMap_tex);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, shadowMap_tex, 0);
            glDrawBuffer(GL_NONE);
            glReadBuffer(GL_NONE);
            GLenum result = glCheckFramebufferStatus(GL_FRAMEBUFFER);
            if (GL_FRAMEBUFFER_COMPLETE != result) {
                printf("ERROR: Framebuffer is not complete.\n");
                return -1;
            }
            //Draw shadow scene
            printf("Creating shadow buffers..\n");
            int ticks = SDL_GetTicks();
            glClear(GL_DEPTH_BUFFER_BIT); //Wipe the depth buffer
            glViewport(0, 0, s_res, s_res);
            isMappingShad = true;
            //DRAW
            glEnableVertexAttribArray(sm_attr_coord3d);
            glVertexAttribPointer(sm_attr_coord3d, 3, GL_FLOAT, GL_FALSE, 5*4, scene);
            glDrawArrays(GL_TRIANGLES, 0, 14*3);
            glDisableVertexAttribArray(sm_attr_coord3d);
            isMappingShad = false;
            glBindFramebuffer(GL_FRAMEBUFFER, 0);
            printf("Render Sbuf in %dms (GLerr: %d)\n", SDL_GetTicks() - ticks, glGetError());
            return 0;
        }

    This is the full code for the POC shadow mapping project (C++; it requires SDL 1.2, SDL-image 1.2, GLEW (1.5) and GLM development headers). initShadowMap is called, followed by loadSMap; the scene is drawn from the camera POV, and then dnitShadowMap is called. I followed this tutorial originally (along with another, more comprehensive tutorial which has disappeared since its author reconfigured his site, but which used to be here (404)). I've ensured that the scene is visible (as can be seen within the full project) to the light source (which uses an orthographic projection matrix). The shader utilities function fine in non-shadow-mapped projects. I should also note that at no point is the GL error state set. What am I doing wrong here, and why did this not cause problems on my AMD card? (System: Ubuntu 12.04, Linux 3.2.0-49-generic, 64-bit, with the nvidia-experimental-310 driver package. All other games are functioning fine, so it's most likely not a card/driver issue.)
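
    For debugging a driver-dependent difference like this, tracing the GL calls can show the exact FBO state each driver sees at glCheckFramebufferStatus; a sketch using apitrace (packaged in newer Ubuntu releases, otherwise built from source; the POC binary name is a placeholder):

        apitrace trace ./shadow_poc      # records every GL call to shadow_poc.trace
        qapitrace shadow_poc.trace       # replay and inspect state at any call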

    Read the article

  • Unable to install on a Samsung 305v5a

    - by Antony
    I have used Ubuntu for years now. I bought a Samsung 305V5A-SO2 laptop yesterday; it runs an AMD A8 quad-core. I have a CD of 10.04, and as I was not clear whether to install 32- or 64-bit, I thought I would run the Ubuntu live trial from the CD to see it. After about 30 minutes I started getting authentication failure messages, then:

        SQUASHFS error: Unable to read fragment cache entry

    Then a zillion buffer logical-block error messages, 17,000+ of them. Should I go download 11.10, in 32-bit, or try the 64-bit? I really don't want to mess up the new laptop already, but I'm not going to want to work with W7 either. Thanks for any help.
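
    SQUASHFS read errors during a live session usually point at bad media or a bad download rather than at the laptop; verifying the image is a cheap first step (the filename below is an example):

        md5sum ubuntu-11.10-desktop-amd64.iso
        # compare with the official sums at https://help.ubuntu.com/community/UbuntuHashes,
        # then re-burn at low speed or write the image to a USB stick instead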

    Read the article

  • Rebooting problem

    - by Vinoth
    After restarting my Ubuntu PC, I get a dialog window showing "Your system is running in low graphics mode", and after clicking on OK I am prompted with these options:

        1. Run in low graphics mode for just one session
        2. Reconfigure graphics
        3. Troubleshoot the error
        4. Exit to console login

    When I troubleshoot the error, it shows a black screen with the following lines (as far as I remember):

        we do not own /var/log/pm-powersave.log
        .......................................
        couldn't write bytes: Broken pipe
        .......................................
        .......................................
        Starting System V runlevel compatibility

    It gets stuck at this point. Actually, before restarting, I had changed the owner of all the files under /var/log and /var/lib. Is there any relation to the above issue? Please help me resolve it; it's urgent.
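
    Recursively changing owners under /var/log and /var/lib is very likely the cause, since many services refuse to start when they no longer own their files. A first-aid sketch from the recovery console (not exhaustive; some packages expect their own users to own specific subdirectories, so a full repair may mean reinstalling the affected packages):

        sudo chown -R root:root /var/log /var/lib
        sudo chown -R lightdm:lightdm /var/lib/lightdm   # if LightDM is the display manager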

    Read the article

  • ATI Catalyst doesn't retain changes after reboot when setting extended display

    - by rfc1484
    I have Ubuntu 11.10 and I am trying to set up an extended desktop across my two displays. I have an AMD 6870; fglrx and fglrx-updates are installed. When I launch amdcccle through the terminal (using sudo), I select, in the "Multi-Display" tab, the option "Multi-display Desktop with Display(s)". It then says that for the changes to take effect I have to reboot my computer. Being a good and obedient lad I do just that, but after rebooting the displays are still in the same "clone" mode as before in the Catalyst Control Center, and no changes have been made. Any suggestions?
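
    Catalyst persists layout changes by rewriting /etc/X11/xorg.conf, so a stale or unwritable config can silently drop them; one commonly suggested sequence (a sketch):

        sudo aticonfig --initial -f    # force-regenerate xorg.conf for the AMD card
        sudo amdcccle                  # redo the multi-display setup, then reboot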

    Read the article

  • Unmet dependencies when updating the Intel graphics driver

    - by Nick Jacobs
    I am trying to update my graphics driver by running:

        sudo apt-get install xserver-xorg-video-intel

    But I keep getting this message:

        Some packages could not be installed. This may mean that you have
        requested an impossible situation or if you are using the unstable
        distribution that some required packages have not yet been created
        or been moved out of Incoming.
        The following information may help to resolve the situation:
        The following packages have unmet dependencies:
         xserver-xorg-video-intel : Depends: xorg-video-abi-11
        E: Unable to correct problems, you have held broken packages.

    When I run sudo apt-get install -f, nothing happens, because the install never even gets halfway. I am running Lubuntu 12.04 on an HP Mini 110 netbook. I have tried changing to several different PPAs, but that doesn't work. I want to update my driver so that my mouse cursor stops flickering and ScummVM stops flickering every minute.
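
    The unmet xorg-video-abi-11 dependency usually means the candidate package was built against a newer X server than 12.04 ships, i.e. it comes from one of the added PPAs; a sketch of backing out (the PPA name below is only an example; substitute the ones you actually added):

        sudo apt-get install ppa-purge
        sudo ppa-purge ppa:xorg-edgers/ppa       # repeat for each added PPA
        sudo apt-get update
        sudo apt-get install xserver-xorg-video-intel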

    Read the article

  • "The system is running in low graphics mode", unable to connect to the Internet

    - by Blah
    When I try to log on to Ubuntu, I get a notice that reads: "The system is running in low graphics mode." I searched for a solution and there is a slew of answers; however, many of them seem to suggest reinstalling different things. After pressing Ctrl+Alt+F1 and getting to a terminal, I attempt commands like sudo apt-get install example. Upon doing so I get notices such as "could not resolve host", which I assume means I am not connected to the Internet. Since I cannot connect to the Internet, none of these responses are helpful to me. What can I do?
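
    To get the console online long enough for apt to work, wired DHCP is the simplest route; a sketch (eth0 is an assumption; list the interfaces with ip link):

        sudo dhclient eth0             # request an address on the wired interface
        ping -c 3 archive.ubuntu.com   # confirms both connectivity and DNS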

    Read the article

  • After reinstalling ATI graphics drivers, my keyboard and mouse aren't working anymore

    - by Lifelike27
    On Ubuntu 11.04 I tried to install the open-source drivers for my laptop's ATI Mobility Radeon 5470M, but I messed that up a bit and lost the X server. I managed to solve that problem with this, and by downloading the ATI proprietary drivers and installing them manually. Now, when Ubuntu loads up I get to the login screen, but I can't use my mouse or my keyboard (a USB keyboard and mouse don't work either). If I use the recovery console, log in there and then run startx, I can log in fine (though Unity doesn't show; the graphics seem to be working, because the fading libnotify animation appears), but I can't type or move my mouse.
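
    A manual driver install that replaces xserver-xorg-core can remove the X input drivers along with it, which would match dead USB input at the greeter; a hedged first check from the recovery console:

        dpkg -l xserver-xorg-input-evdev     # is the input driver still installed?
        sudo apt-get install --reinstall xserver-xorg-input-evdev xserver-xorg-core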

    Read the article

  • Ubuntu 13.04/13.10 - low graphics mode/black screen after login

    - by Richard Stokes
    Since last week, my Ubuntu 13.04 installation has started booting into low-graphics mode out of nowhere. I tried a number of solutions found on this site, but to no avail. The other night I decided to upgrade to 13.10 to see if the issue would be fixed in the process. Since then the system starts fine, but when I log in to my account I am just presented with a black screen. I switched to the terminal and applied all pending updates, again to no avail. I don't even know where to start debugging this issue, so I'd appreciate it if someone could point me to the first place to look.
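
    Since the system itself boots and only the session is black, a stale per-user desktop configuration is a reasonable first suspect; a sketch from the Ctrl+Alt+F1 console (the paths are the Unity/Compiz defaults on 13.10):

        mv ~/.config/compiz-1 ~/.config/compiz-1.bak   # reset the window manager's config
        mv ~/.config/dconf ~/.config/dconf.bak         # broader: resets most desktop settings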

    Read the article

  • xrandr shows VGA1 as disconnected

    - by Felix
    I have a Thinkpad W520 with Nvidia Optimus graphics. I have disabled the Nvidia card in the BIOS (by selecting "integrated graphics"), so I'm running only on the integrated Intel graphics. I get full 3D acceleration, which would suggest the drivers are properly installed. However, I'm not able to use an external monitor. With the external monitor connected and turned on, running xrandr always gives:

        $ xrandr
        Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 8192 x 8192
        LVDS1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
           1920x1080      60.0*+   59.9     50.0
           1680x1050      60.0     59.9
           1600x1024      60.2
           1400x1050      60.0
           1280x1024      60.0
           1440x900       59.9
           1280x960       60.0
           1360x768       59.8     60.0
           1152x864       60.0
           1024x768       60.0
           800x600        60.3     56.2
           640x480        59.9
        VGA1 disconnected (normal left inverted right x axis y axis)

    What gives? It sees the VGA1 port (to which the external display is connected), but it appears disconnected. I have tried forcing a resolution as per these instructions, but when I do that, X becomes unresponsive and I have to Ctrl-Alt-F1 and restart it.
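
    For reference, the usual recipe for forcing a mode onto an output that probes as disconnected looks like this (the modeline comes from cvt; 1280x1024 at 60 Hz is just an example target). If X locks up at this point, as described above, the kernel parameter video=VGA-1:e, which forces the connector on at the KMS level, is another route to try:

        cvt 1280 1024 60
        xrandr --newmode "1280x1024_60.00" 109.00 1280 1368 1496 1712 1024 1027 1034 1063 -hsync +vsync
        xrandr --addmode VGA1 "1280x1024_60.00"
        xrandr --output VGA1 --mode "1280x1024_60.00"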

    Read the article

  • Depending on another open source library: copy/paste code or include

    - by user5794
    I'm working on a large class and have started implementing new features that need graphics. I started writing the graphics functions myself, but I know that open-source libraries exist that can provide this functionality without me having to write it myself. The problem is that I prefer the class to be self-sufficient and not dependent on any other library. If I don't write it myself, I have to ask the user to make sure a graphics library is already installed (less user-friendly). If I write it myself, I do a lot more work than I have to. I could also copy/paste some of the relevant code into my own class, but I'm not sure about the disadvantages of doing this (it's an open-source library whose license matches mine, so I'm not concerned with legality, just with whether there are disadvantages programming-wise). So what should I do:

        1. copy/paste code from the external library,
        2. write the code myself so it's truly self-sufficient, or
        3. ask the user to download and install another library?

    Read the article

  • Bumblebee optirun appears to depend on Intel

    - by user206398
    I have a Lenovo T420 with Intel and Nvidia graphics. On upgrading to Ubuntu Saucy, I had to purge and reinstall bumblebee-nvidia to get beyond optirun failing to find a GPU driver. Now "optirun glxgears" and "optirun sol" succeed, but optirun fails on two virtual-world viewers that it supported in the past, Cool VL (CoolVLViewer-1.26.8.34-Linux-x86) and Imprudence (Imprudence 1.4.0 beta2). In both cases the error output is huge, but it starts with:

        libGL error: failed to load driver: i965
        libGL error: failed to load driver: swrast

    From the little I can discover, i965 is an Intel graphics driver, which should not be invoked at all. I haven't found any information about swrast. I suspect that some of the X configuration associated with Bumblebee has an Intel dependence that is invoked by certain library calls but not others, though I haven't found any definite information along this line. The Cool VL Viewer runs without optirun, but complains about the insufficiency of the Intel graphics.
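
    optirun renders on the Nvidia card but displays through the Intel side, so a broken i965 (Intel) Mesa driver takes optirun down with it; swrast is Mesa's software fallback, which loads only when the real driver fails. Also, both failing viewers are 32-bit builds (-x86), so the 32-bit Mesa drivers (libgl1-mesa-dri:i386) are another thing to check. A quick way to test the two paths separately (glxinfo is in mesa-utils):

        glxinfo | grep "OpenGL renderer"            # the Intel/i965 path
        optirun glxinfo | grep "OpenGL renderer"    # the Nvidia path via Bumblebee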

    Read the article

  • Kubuntu 12.04 runs slowly

    - by randy
    I have a 3 GHz AMD64 dual-core processor (AM3 socket), an Asus motherboard, 4 GB RAM, and an Nvidia 8800 GTS video card with 500 MB RAM. I installed Kubuntu 12.04 and it is very laggy: I select a menu, and 20 seconds later the menu pops up. I switched to the classic menu and that seemed to help. In what direction should I look first to get this running better? Video, perhaps? I had Ubuntu 11.04 on this machine previously and had no problems with speed.
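
    The symptoms fit an 8800 GTS running without acceleration; checking which kernel driver is bound, and installing the packaged proprietary one, is a reasonable first step (nvidia-current is the usual 12.04 package name):

        lspci -k | grep -A 2 VGA            # look for "Kernel driver in use: ..."
        sudo apt-get install nvidia-current
        sudo reboot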

    Read the article

  • lightdm.conf content erased, now stuck in low graphics mode

    - by user79318
    This evening I was attempting to disable the guest account, and something went awry. Currently, on boot, Ubuntu enters low-graphics mode, with no specific error report. What did I do? Before this error occurred, I added a line to lightdm.conf to disable the guest account. I think I may have accidentally erased the contents of lightdm.conf; I'm not entirely sure. I have spent the past hour troubleshooting with various suggestions from other questions, to no avail.
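
    If the file really was emptied, recreating it should be enough to get past low-graphics mode. The stock 12.04 /etc/lightdm/lightdm.conf is short; the contents below are the Ubuntu defaults (adjust greeter-session and user-session on a flavour such as Kubuntu), with the guest-disabling line added:

        [SeatDefaults]
        greeter-session=unity-greeter
        user-session=ubuntu
        allow-guest=false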

    Read the article
