Search Results

Search found 1987 results on 80 pages for 'nvidia optimus'.

Page 18/80

  • Strange ATI vs Nvidia TRIANGLE_STRIP issue

    - by chriscisco
    I have this code, which I am using as a test for the engine I am working on. On my NVIDIA NVS 4200M it displays the GL_TRIANGLE_STRIP as expected; on my ATI Radeon 5800 it appears to draw a single triangle.

        shader.begin();
        Matrix4<float> temp = getActiveCamera()->getProjectionMatrix()
                            * getActiveCamera()->getObjectToWorld().fastInverse();
        glUniformMatrix4fv(shader["mvp"], 1, GL_TRUE, temp.getArray());
        glBegin(GL_TRIANGLE_STRIP);
        glVertexAttrib3f(shader["colour"], 0, 1, 0);
        glVertexAttrib3f(shader["coord3d"], -.5, -.5, 0);
        glVertexAttrib3f(shader["colour"], 1, 1, 0);
        glVertexAttrib3f(shader["coord3d"], 0.5, -.5, 0);
        glVertexAttrib3f(shader["colour"], 1, 0, 1);
        glVertexAttrib3f(shader["coord3d"], -.5, .5, 0);
        glVertexAttrib3f(shader["colour"], 0, 1, 1);
        glVertexAttrib3f(shader["coord3d"], .5, .5, 0);
        glEnd();
        shader.end();

    Here is what it actually looks like on my two computers:

        https://www.dropbox.com/s/sgm2j978tx2ipnp/not%20working.png
        https://www.dropbox.com/s/27idv0b8k0p4pcx/working.png

    Read the article

  • Triple-head on a Lenovo T520

    - by codeape
    I have a Lenovo T520 with integrated Intel HD graphics plus an NVidia card (Optimus), running Ubuntu 11.10. I would like to use the built-in screen plus two external screens. This PDF indicates that it is possible to connect up to four external monitors to the laptop, but the information is Windows-only. I was planning to disable the NVidia card, since I have read that Linux support for Optimus is not good. Questions: Has anyone set up three monitors on NVidia hardware? Has anyone set up three monitors using the Intel HD 3000? Can I expect it to work out of the box, or are there tricks I need to be aware of?
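
    If the NVidia card is disabled in the BIOS and only the Intel HD 3000 is active, multi-monitor layouts are normally driven with plain xrandr. A minimal sketch (the output names LVDS1 and HDMI1 are assumptions; run xrandr -q to see what your system actually calls them, and note that the HD 3000 can typically drive only two displays at once, which may rule out true triple-head on the Intel chip alone):

        xrandr -q                                        # list detected outputs and modes
        xrandr --output LVDS1 --auto \
               --output HDMI1 --auto --right-of LVDS1    # built-in panel plus one external screen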

    Read the article

  • is bumblebee supposed to be a "seamless" implementation?

    - by broiyan
    It is said that bumblebee is used to make Nvidia Optimus work on Linux. The Wikipedia description of Optimus says that the switching is designed to be completely seamless and to happen "behind the scenes". However, after installing bumblebee, the recommended test seems to be to compare the performance of glxspheres with that of optirun glxspheres, and the latter is faster than the former. If the switching is supposed to be seamless, why do we need optirun to speed up graphics? Shouldn't it be automatic?
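
    For reference, the comparison usually cited looks like the sketch below (it assumes the VirtualGL package that ships glxspheres is installed; on 64-bit systems the binary may be called glxspheres64). Bumblebee only powers up the NVIDIA GPU for programs explicitly launched through optirun, which is why the two runs differ:

        glxspheres             # rendered on the integrated Intel GPU
        optirun glxspheres     # offloaded to the NVIDIA GPU via bumblebee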

    Read the article

  • Should I switch my graphics mode in the BIOS to avoid using Bumblebee?

    - by Fawkes5
    I have just purchased an Acer 3830TG from the TimelineX series. To my surprise, I found out that there is no first-party Linux support for Nvidia Optimus. Bumblebee works great, but the battery life with the graphics card always running is not so great. I don't use Linux for games, so I don't really need the graphics card on; I have Windows for that. In my BIOS, I have the ability to change the graphics mode from switchable to integrated. If I do this and reinstall Ubuntu, what will happen? Will my Nvidia card just turn off? Will everything work properly, as if I'm not running an Optimus laptop? Is this recommended, as opposed to dealing with Bumblebee? What is the best thing I could do?
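
    One way to verify what the BIOS switch did (a sketch, not a guaranteed procedure): after selecting integrated mode, the discrete GPU should disappear from the PCI bus entirely, at which point Bumblebee serves no purpose and can be removed.

        lspci | grep -i vga                          # should now list only the Intel controller
        sudo apt-get purge bumblebee* virtualgl*     # assumes Bumblebee was installed from packages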

    Read the article

  • Can the NVIDIA ION chipset handle streaming and gaming reasonably well?

    - by true_gritt
    I'm considering getting a small-footprint "nettop" computer to use as a home theater PC with my Samsung LN40A550 HDTV. I've been looking at systems like the ASRock ION HT330, the Acer AspireRevo 3610, and the Asus EeeBox PC EB1501. These are all systems with the NVIDIA ION chipset (a dual-core Intel Atom 330 CPU plus an NVIDIA GeForce 9400 GPU). Is the NVIDIA ION chipset powerful enough to support media streaming at HD resolutions (e.g. via Boxee, Hulu, Netflix) and casual gaming (e.g. World of Warcraft, Madden NFL) reasonably well, without herky-jerky video output?
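
    On Linux, smooth HD playback on ION generally depends on VDPAU hardware decoding on the GeForce 9400 rather than on the Atom itself. A quick capability check (a sketch; it assumes the proprietary NVIDIA driver and Ubuntu's vdpauinfo package):

        sudo apt-get install vdpauinfo
        vdpauinfo        # look for H.264 and VC-1 entries under the decoder capabilities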

    Read the article

  • NVIDIA same chipset, but different implementations - what is the difference?

    - by Horst Walter
    I am planning to buy a graphics card. When searching for a particular chipset (e.g. GTX 460) I find cards from different vendors (Gigabyte, Palit, PNY, ...). I can figure out the differences in frequency, memory, and bundled equipment. When I read test reports, usually a particular NVIDIA card is compared with its ATI/AMD "counterpart"; I have not really found a comparison of all vendors for a particular NVIDIA chipset. So, in order to make a decision: a) Are the drivers the same for all cards with a particular chipset, and are they provided by NVIDIA or by the vendor? b) How do I figure out which card to actually buy? OK, I choose the chipset and memory, and check that the card has the required ports, but then ....

    Read the article

  • Kepler, NVIDIA's new GPU architecture: a presentation of the technology and its performance

    Kepler, NVIDIA's new graphics processor architecture: a presentation of the new technologies and their performance. Anticipated for several months, NVIDIA's new graphics card architecture was officially announced last week. This new architecture is intended to compete with AMD's new architecture, released last month. The first card in the range is called the GTX 680 and is based on the GK104 chip. For the previous generation (the Fermi architecture), NVIDIA had focused on adding tessellation and improving performance. For Kepler, NVIDIA has worked mainly on power consumption: a 28 nm process, the new SMX, GP...

    Read the article

  • Updating Dell Vostro 3700 (Nvidia GeForce GT330M) display driver?

    - by iRubens
    I've bought a "Dell Vostro 3700" laptop, which has an integrated Intel graphics card and an Nvidia GeForce GT330M inside; depending on the energy-saving mode, it switches between the two video cards. When I try to update the video driver (currently version 189.99 on Windows 7 64-bit) with the one found on the Nvidia site, an error message says that it cannot find compatible graphics hardware. Dell doesn't provide a newer driver version. Has anyone solved this problem?

    Read the article

  • What are the implications of Nvidia's "the way it's meant to be played"?

    - by Mike Pateras
    I have an AMD Radeon 5850 (about to be 2), and today I read that Rift is a member of Nvidia's "the way it's meant to be played" program. It was suggested that as such the developers would not be speaking with or working directly with AMD for optimization, and that it would be unlikely that Crossfire support would be added until the game's release. Are any of these implications likely? Or does it just mean that Nvidia is working closely with the developers for optimization and marketing support?

    Read the article

  • Unity won't load with proprietary drivers

    - by Nobita
    Running Ubuntu 11.04 for the first time and getting used to Unity, I decided to install the proprietary driver for my Nvidia graphics card. The output of lspci | grep VGA is:

        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: nVidia Corporation Device 0df5 (rev a1)

    If I activate the driver that is "recommended", the next time I try to log in to a Unity session it just changes to the classic desktop. How can that be happening? I attach a screenshot of my proprietary driver screen:
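
    A quick way to see why Unity is refusing to start (a sketch; this helper ships with the Nux toolkit on 11.04, so the path should be correct there):

        /usr/lib/nux/unity_support_test -p    # prints which 3D requirements pass or fail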

    Read the article

  • NVIDIA CUDA SDK Examples Compilation: Unsupported Architecture 'compute_20'

    - by Andrew Bolster
    On compilation of the CUDA SDK, I'm getting: nvcc fatal : Unsupported gpu architecture 'compute_20'. My toolkit is version 2.3 on a shared system (i.e. I can't really upgrade), the driver version is also 2.3, and I am running on 4 Tesla C1060s. If it helps, the error is triggered in radixsort. It appears that a few people online have had this problem, but I haven't found anywhere that actually gives a solution.
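
    Toolkit 2.3's nvcc predates Fermi, so it rejects compute_20 targets outright, and the Tesla C1060 is compute capability 1.3 anyway. One workaround people use is to retarget the offending makefile (a sketch; find out which file actually injects the flag before editing, and back it up first, since the common.mk path is an assumption):

        grep -rl compute_20 .        # locate makefiles requesting the unsupported target
        sed -i 's/compute_20/compute_13/g; s/sm_20/sm_13/g' common/common.mk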

    Read the article

  • OpenCL: Strange buffer or image behaviour with NVidia but not AMD

    - by Alex R.
    I have a big problem (on Linux): I create a buffer with defined data, then an OpenCL kernel takes this data and puts it into an image2d_t. When working on an AMD C-50 (Fusion CPU/GPU) the program works as desired, but on my GeForce 9500 GT the kernel computes the correct result only rarely. Sometimes the result is correct, but very often it is incorrect. Sometimes it depends on very strange changes, like removing unused variable declarations or adding a newline. I realized that disabling optimization increases the probability of failure. I have the latest display driver on both systems. Here is my reduced code:

        #include <CL/cl.h>
        #include <string>
        #include <iostream>
        #include <sstream>
        #include <cmath>

        void checkOpenCLErr(cl_int err, std::string name){
            const char* errorString[] = {
                "CL_SUCCESS", "CL_DEVICE_NOT_FOUND", "CL_DEVICE_NOT_AVAILABLE",
                "CL_COMPILER_NOT_AVAILABLE", "CL_MEM_OBJECT_ALLOCATION_FAILURE",
                "CL_OUT_OF_RESOURCES", "CL_OUT_OF_HOST_MEMORY",
                "CL_PROFILING_INFO_NOT_AVAILABLE", "CL_MEM_COPY_OVERLAP",
                "CL_IMAGE_FORMAT_MISMATCH", "CL_IMAGE_FORMAT_NOT_SUPPORTED",
                "CL_BUILD_PROGRAM_FAILURE", "CL_MAP_FAILURE",
                "", "", "", "", "", "", "", "", "",
                "", "", "", "", "", "", "", "",
                "CL_INVALID_VALUE", "CL_INVALID_DEVICE_TYPE", "CL_INVALID_PLATFORM",
                "CL_INVALID_DEVICE", "CL_INVALID_CONTEXT", "CL_INVALID_QUEUE_PROPERTIES",
                "CL_INVALID_COMMAND_QUEUE", "CL_INVALID_HOST_PTR", "CL_INVALID_MEM_OBJECT",
                "CL_INVALID_IMAGE_FORMAT_DESCRIPTOR", "CL_INVALID_IMAGE_SIZE",
                "CL_INVALID_SAMPLER", "CL_INVALID_BINARY", "CL_INVALID_BUILD_OPTIONS",
                "CL_INVALID_PROGRAM", "CL_INVALID_PROGRAM_EXECUTABLE",
                "CL_INVALID_KERNEL_NAME", "CL_INVALID_KERNEL_DEFINITION", "CL_INVALID_KERNEL",
                "CL_INVALID_ARG_INDEX", "CL_INVALID_ARG_VALUE", "CL_INVALID_ARG_SIZE",
                "CL_INVALID_KERNEL_ARGS", "CL_INVALID_WORK_DIMENSION",
                "CL_INVALID_WORK_GROUP_SIZE", "CL_INVALID_WORK_ITEM_SIZE",
                "CL_INVALID_GLOBAL_OFFSET", "CL_INVALID_EVENT_WAIT_LIST", "CL_INVALID_EVENT",
                "CL_INVALID_OPERATION", "CL_INVALID_GL_OBJECT", "CL_INVALID_BUFFER_SIZE",
                "CL_INVALID_MIP_LEVEL", "CL_INVALID_GLOBAL_WORK_SIZE",
            };
            if (err != CL_SUCCESS) {
                std::stringstream str;
                str << errorString[-err] << " (" << err << ")";
                throw std::string(name)+(str.str());
            }
        }

        int main(){
            try{
                cl_context m_context;
                cl_platform_id* m_platforms;
                unsigned int m_numPlatforms;
                cl_command_queue m_queue;
                cl_device_id m_device;

                cl_int error = 0; // Used to handle error codes
                clGetPlatformIDs(0,NULL,&m_numPlatforms);
                m_platforms = new cl_platform_id[m_numPlatforms];
                error = clGetPlatformIDs(m_numPlatforms,m_platforms,&m_numPlatforms);
                checkOpenCLErr(error, "getPlatformIDs");

                // Device
                error = clGetDeviceIDs(m_platforms[0], CL_DEVICE_TYPE_GPU, 1, &m_device, NULL);
                checkOpenCLErr(error, "getDeviceIDs");

                // Context
                cl_context_properties properties[] =
                    { CL_CONTEXT_PLATFORM, (cl_context_properties)(m_platforms[0]), 0};
                m_context = clCreateContextFromType(properties, CL_DEVICE_TYPE_GPU, NULL, NULL, NULL);
                // m_private->m_context = clCreateContext(properties, 1, &m_private->m_device, NULL, NULL, &error);
                checkOpenCLErr(error, "Create context");

                // Command-queue
                m_queue = clCreateCommandQueue(m_context, m_device, 0, &error);
                checkOpenCLErr(error, "Create command queue");

                // Build program and kernel
                const char* source =
                    "#pragma OPENCL EXTENSION cl_khr_byte_addressable_store : enable\n"
                    "\n"
                    "__kernel void bufToImage(__global unsigned char* in, __write_only image2d_t out, const unsigned int offset_x, const unsigned int image_width , const unsigned int maxval ){\n"
                    "\tint i = get_global_id(0);\n"
                    "\tint j = get_global_id(1);\n"
                    "\tint width = get_global_size(0);\n"
                    "\tint height = get_global_size(1);\n"
                    "\n"
                    "\tint pos = j*image_width*3+(offset_x+i)*3;\n"
                    "\tif( maxval < 256 ){\n"
                    "\t\tfloat4 c = (float4)(in[pos],in[pos+1],in[pos+2],1.0f);\n"
                    "\t\tc.x /= maxval;\n"
                    "\t\tc.y /= maxval;\n"
                    "\t\tc.z /= maxval;\n"
                    "\t\twrite_imagef(out, (int2)(i,j), c);\n"
                    "\t}else{\n"
                    "\t\tfloat4 c = (float4)(255.0f*in[2*pos]+in[2*pos+1],255.0f*in[2*pos+2]+in[2*pos+3],255.0f*in[2*pos+4]+in[2*pos+5],1.0f);\n"
                    "\t\tc.x /= maxval;\n"
                    "\t\tc.y /= maxval;\n"
                    "\t\tc.z /= maxval;\n"
                    "\t\twrite_imagef(out, (int2)(i,j), c);\n"
                    "\t}\n"
                    "}\n"
                    "\n"
                    "__constant sampler_t imageSampler = CLK_NORMALIZED_COORDS_FALSE | CLK_ADDRESS_CLAMP_TO_EDGE | CLK_FILTER_NEAREST;\n"
                    "\n"
                    "__kernel void imageToBuf(__read_only image2d_t in, __global unsigned char* out, const unsigned int offset_x, const unsigned int image_width ){\n"
                    "\tint i = get_global_id(0);\n"
                    "\tint j = get_global_id(1);\n"
                    "\tint pos = j*image_width*3+(offset_x+i)*3;\n"
                    "\tfloat4 c = read_imagef(in, imageSampler, (int2)(i,j));\n"
                    "\tif( c.x <= 1.0f && c.y <= 1.0f && c.z <= 1.0f ){\n"
                    "\t\tout[pos] = c.x*255.0f;\n"
                    "\t\tout[pos+1] = c.y*255.0f;\n"
                    "\t\tout[pos+2] = c.z*255.0f;\n"
                    "\t}else{\n"
                    "\t\tout[pos] = 200.0f;\n"
                    "\t\tout[pos+1] = 0.0f;\n"
                    "\t\tout[pos+2] = 255.0f;\n"
                    "\t}\n"
                    "}\n";

                cl_int err;
                cl_program prog = clCreateProgramWithSource(m_context,1,&source,NULL,&err);
                if( -err != CL_SUCCESS ) throw std::string("clCreateProgramWithSources");
                err = clBuildProgram(prog,0,NULL,"-cl-opt-disable",NULL,NULL);
                if( -err != CL_SUCCESS ) throw std::string("clBuildProgram(fromSources)");
                cl_kernel kernel = clCreateKernel(prog,"bufToImage",&err);
                checkOpenCLErr(err,"CreateKernel");

                cl_uint imageWidth = 8;
                cl_uint imageHeight = 9;

                // Initialize data
                cl_uint maxVal = 255;
                cl_uint offsetX = 0;
                int size = imageWidth*imageHeight*3;
                int resSize = imageWidth*imageHeight*4;
                cl_uchar* data = new cl_uchar[size];
                cl_float* expectedData = new cl_float[resSize];
                for( int i = 0,j=0; i < size; i++,j++ ){
                    data[i] = (cl_uchar)i;
                    expectedData[j] = (cl_float)i/255.0f;
                    if ( i%3 == 2 ){
                        j++;
                        expectedData[j] = 1.0f;
                    }
                }

                cl_mem inBuffer = clCreateBuffer(m_context,CL_MEM_READ_ONLY|CL_MEM_COPY_HOST_PTR,size*sizeof(cl_uchar),data,&err);
                checkOpenCLErr(err, "clCreateBuffer()");
                clFinish(m_queue);

                cl_image_format imgFormat;
                imgFormat.image_channel_order = CL_RGBA;
                imgFormat.image_channel_data_type = CL_FLOAT;
                cl_mem outImg = clCreateImage2D( m_context, CL_MEM_READ_WRITE, &imgFormat, imageWidth, imageHeight, 0, NULL, &err );
                checkOpenCLErr(err,"get2DImage()");
                clFinish(m_queue);

                size_t kernelRegion[]={imageWidth,imageHeight};
                size_t kernelWorkgroup[]={1,1};

                // Fill kernel with data
                clSetKernelArg(kernel,0,sizeof(cl_mem),&inBuffer);
                clSetKernelArg(kernel,1,sizeof(cl_mem),&outImg);
                clSetKernelArg(kernel,2,sizeof(cl_uint),&offsetX);
                clSetKernelArg(kernel,3,sizeof(cl_uint),&imageWidth);
                clSetKernelArg(kernel,4,sizeof(cl_uint),&maxVal);

                // Run kernel
                err = clEnqueueNDRangeKernel(m_queue,kernel,2,NULL,kernelRegion,kernelWorkgroup,0,NULL,NULL);
                checkOpenCLErr(err,"RunKernel");
                clFinish(m_queue);

                // Check resulting data for validity
                cl_float* computedData = new cl_float[resSize];
                size_t region[]={imageWidth,imageHeight,1};
                const size_t offset[] = {0,0,0};
                err = clEnqueueReadImage(m_queue,outImg,CL_TRUE,offset,region,0,0,computedData,0,NULL,NULL);
                checkOpenCLErr(err, "readDataFromImage()");
                clFinish(m_queue);

                for( int i = 0; i < resSize; i++ ){
                    if( fabs(expectedData[i]-computedData[i])>0.1 ){
                        std::cout << "Expected: \n";
                        for( int j = 0; j < resSize; j++ ){
                            std::cout << expectedData[j] << " ";
                        }
                        std::cout << "\nComputed: \n";
                        std::cout << "\n";
                        for( int j = 0; j < resSize; j++ ){
                            std::cout << computedData[j] << " ";
                        }
                        std::cout << "\n";
                        throw std::string("Error, computed and expected data are not the same!\n");
                    }
                }
            }catch(std::string& e){
                std::cout << "\nCaught an exception: " << e << "\n";
                return 1;
            }
            std::cout << "Works fine\n";
            return 0;
        }

    I also uploaded the source code to make it easier to test: http://www.file-upload.net/download-3513797/strangeOpenCLError.cpp.html Please can you tell me if I've done anything wrong? Is there a mistake in the code, or is this a bug in my driver? Best regards, Alex

    Read the article

  • No alternative drivers appearing in Software Sources and manual install leads to no Unity

    - by Gausie
    I just got a new laptop, installed Ubuntu 12.10 on it, and am trying to install the proprietary nvidia drivers. Once I understood the change from jockey, I did a fresh install and followed these instructions: http://techhamlet.com/2012/11/install-nvidia-drivers-in-ubuntu-12-10/ But when I do, Unity crashes on startup. My hardware according to lspci | grep VGA is as follows:

        00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation GK107 [GeForce GT 650M] (rev a1)

    I've followed a couple of nvidia 12.10 tutorials found on Google, but none have helped. Can I get any specific advice?
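
    One recovery path that is often suggested when a manual driver install breaks Unity (a sketch, not a guaranteed fix; it assumes the driver came from Ubuntu packages rather than the .run installer from nvidia.com):

        sudo apt-get purge nvidia*             # clear out the mixed driver files
        sudo apt-get install nvidia-current    # reinstall the packaged driver
        sudo nvidia-xconfig                    # regenerate /etc/X11/xorg.conf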

    Read the article

  • What packages should I install in Ubuntu 12.04 to fulfill the OpenGL requirements for using the nouveau driver?

    - by karolszk
    I am trying to switch from the nvidia to the nouveau driver via this script:

        #!/bin/bash
        stop gdm
        rmmod nvidia
        sed -i "s/nouveau/nvidia/" /etc/modprobe.d/blacklist-nvidia-nouveau.conf
        update-alternatives --set gl_conf /usr/lib/mesa/ld.so.conf
        ldconfig
        modprobe nouveau
        cp /etc/X11/xorg.conf{.nouveau,}
        start gdm

    The driver is loaded and X starts, but compiz does not. In .xsession-errors I see:

        Compiz (opengl) - Fatal: Root visual is not a GL visual
        compiz (opengl) - Error: initScreen failed
        compiz (core) - Error: Couldn't activate plugin 'opengl'
        Compiz (opengl) - Fatal: Root visual is not a GL visual
        (the previous line repeats seven more times)
        gnome-session[19075]: WARNING: App 'compiz.desktop' respawning too quickly
        gnome-session[19075]: WARNING: Application 'compiz.desktop' killed by signal
        gnome-session[19075]: WARNING: App 'compiz.desktop' respawning too quickly

    What am I doing wrong?
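
    To answer the question in the title: with nouveau, OpenGL comes from Mesa rather than from NVIDIA's libGL. A sketch of the packages usually involved on 12.04 (standard Ubuntu package names; glxinfo comes from mesa-utils):

        sudo apt-get install libgl1-mesa-glx libgl1-mesa-dri libglu1-mesa mesa-utils
        glxinfo | grep -i renderer    # should report a Gallium/nouveau renderer, not a software one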

    Read the article

  • Sound problems in Unity - input but no output

    - by ana
    I am new to Ubuntu, having just installed it for the first time on my Lenovo ThinkPad. Since I installed it, I have had no sound output whatsoever. However, I can see from the graphical interface in Sound Preferences > Input that the sound input appears to be working correctly. I have tried the following:

        https://help.ubuntu.com/community/HdaIntelSoundHowto
        https://wiki.ubuntu.com/Audio/InstallingLinuxAlsaDriverModules
        ubuntu-bug audio

    I have two sound cards:

        cat card0/codec* | grep Codec
        Codec: Conexant CX20585
        Codec: Conexant ID 2c06

        cat card1/codec* | grep Codec
        Codec: Nvidia GPU 0b HDMI/DP
        Codec: Nvidia GPU 0b HDMI/DP
        Codec: Nvidia GPU 0b HDMI/DP
        Codec: Nvidia GPU 0b HDMI/DP

    And now I have pretty much run out of ideas. Can anybody help?
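
    Two quick checks worth running before deeper debugging (a sketch; card numbers and control names vary by machine): make sure the output isn't simply muted at the ALSA level, and confirm that the analog Conexant card, not the HDMI codec, is receiving the test sound.

        alsamixer -c 0              # unmute Master/Speaker with 'm' and raise the volume
        speaker-test -c 2 -t wav    # plays a test sound on the default output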

    Read the article

  • HDMI port not recognized on Sony Vaio

    - by julio
    I am running Ubuntu 11.10 64-bit with a Sony VAIO VPC F11. It has an NVIDIA GeForce 310M video card with the latest Nvidia driver for 64-bit Linux (NVIDIA-Linux-x86_64-280.13), and a Windows partition with Win7 64-bit. The external monitor is a Samsung SyncMaster P2770. If I boot into the Windows partition, HDMI works as expected, with sound and video; under Linux, the HDMI port is apparently not recognized at all and provides no signal to the attached monitor. The nvidia-settings tool does not recognize any monitor connected to the HDMI port. Disper is installed and cannot recognize an attached external monitor. Can anyone help me diagnose this issue and fix it if possible? The laptop has only the one HDMI port to connect any external monitor, so if I can't get this working I'm stuck using either the laptop screen or Windows. Thanks
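
    When diagnosing, it may help to check whether the driver detects the port at all (a sketch; under the NVIDIA binary driver the HDMI output usually shows up under a name like HDMI-0 or DFP-1):

        xrandr -q    # lists the outputs X knows about and whether a monitor is attached
        disper -l    # disper's own list of detected displays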

    Read the article

  • Does onboard video affect the X windows configuration?

    - by Timothy
    Does the onboard video on the motherboard affect the X windows configuration? My system has both onboard and PCIe video. The onboard video is an NVIDIA GeForce 7025 GPU (onboard graphics with up to 512 MB of shared memory via TurboCache). I also have a PCIe dual-head video card installed with two monitors: a GeForce 8400 GS with 512 MB of memory. When installing Ubuntu 12.04, only one monitor worked, and when pulling up System Settings > Displays it shows a laptop, but this is a desktop PC. I did get both monitors to work with the nvidia driver using TwinView (a complicated process!), but when checking nvidia-settings now it shows the monitors disabled, even though the NVIDIA X Server Settings tool does show the GPU and all its information. I was thinking it's seeing the onboard video on the motherboard. Why else would it show a laptop?
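
    One way to test that theory (a sketch): list the GPUs the system actually exposes, and if the onboard 7025 shows up alongside the 8400 GS, disable the onboard chip in the BIOS so that X only ever sees the card driving the monitors.

        lspci | grep -i vga    # two VGA controllers listed here would support the theory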

    Read the article

  • how do I get dual monitors to work properly in Ubuntu 11.10 on a Dell Latitude D630?

    - by wes cook
    I have spent a lot of time trying to get dual monitors to work on Ubuntu 11.10 on my Dell Latitude D630 (nVidia NVS 135M video card).

    For starters, the System > Displays settings app always showed only one unknown monitor, even though I had the external Acer monitor connected. So I downloaded and installed the nVidia drivers. According to what I read, I would need to use only the nVidia driver app (nVidia X Server Settings), so that's what I've done (System > Displays continued to show only a single monitor anyway). The nVidia settings app also showed only one monitor until I changed the BIOS setting to use the onboard video for the external monitor (not the dock video, which it was set to, even though I don't have a docking station). The nVidia settings app then recognized both monitors, so I set up the X Server display config with a separate X screen for each monitor. My laptop screen shows up as AUO 1440x900 and my external monitor as Acer E211H 1920x1080.

    Everything seemed like it would work, but the external monitor was just a complete white screen. It was non-functional, even though sometimes it would show the background image; still, nothing would show up over there. So I checked the Enable Xinerama box. Now, after logging out and back in, the wallpaper extends to both screens, but I get no taskbar at the bottom or top and no system menus, and I have to press the power button to restart or log off. After experimenting with all the shells, the only one that shows the menus and taskbars when I log in is GNOME Classic. This is pretty much the same symptoms as found here: How do I fix 11.10 GUI? So I resign myself to the older shell.

    Everything works fine until ... I unplug the external monitor ... this is a laptop, after all. After doing some work on the road, I plug back in and I still see both screens and it's functional, except that now the laptop screen (the one with the taskbar and menu bar) has 4 black bars at the top that windows cannot cover. The top bar is the menu bar (with Applications, Places, the date and time, and the system menu on the right), but the next 3 bars (the same height as the top menu bar) are empty and just reduce the maximum size of windows on that screen. See screenshot here: http://i39.tinypic.com/35d2kh1.png

    So ... 1. How do I get rid of those extra 3 black bars? They're taking up valuable screen space. 2. (less critical) How do I successfully use both screens in the Ubuntu or Ubuntu 2D shell?

    Read the article

  • Ubuntu Desktop does not load

    - by Niklas
    If I log in on my Ubuntu 14.04, I get the following desktop: This weird behavior appeared after I executed sudo apt-get update && sudo apt-get upgrade and restarted my computer. Don't know why though. On my Ubuntu I have tried the following (nothing seems to work so far):

    Fix any broken packages:

        sudo apt-get update
        sudo apt-get autoclean
        sudo apt-get clean
        sudo apt-get autoremove

    Locate any broken packages and reinstall them:

        sudo apt-get install debsums
        sudo apt-get clean
        sudo debsums_init
        sudo debsums -cs
        sudo apt-get install --reinstall $(sudo dpkg -S $(sudo debsums -c) | cut -d : -f 1 | sort -u)

    Removing some compiz files:

        rm -r ~/.cache/compizconfig-1
        rm -r ~/.compiz

    Purging NVIDIA and installing nvidia-prime:

        sudo apt-get install --reinstall ubuntu-desktop
        sudo apt-get install unity
        sudo apt-get purge nvidia* bumblebee*
        sudo apt-get install nvidia-prime
        sudo shutdown -r now

    CompizConfig Settings Manager (back in the UI, enablement of the Unity plugin):

        sudo apt-get install compizconfig-settings-manager
        export DISPLAY=:0
        ccsm

    Unity replace, which stalled for a while and then did nothing:

        unity --replace

    Some dconf resets:

        dconf reset -f /org/compiz/
        unity --reset-icons &disown

    Actually dconf did not work and I got this error:

        error: Cannot autolaunch D-Bus without X11 $DISPLAY

    Can anybody help me with that? This is my hardware (hope it helps in any way):

        Intel Core i7-3770
        ASUS GTX660TI-DC2-OG-2GD5 (NVIDIA driver is/was installed)
        ASUS P8Z77-V LX
        Corsair DIMM 8 GB DDR3-1600 Kit
        Samsung 830 series 2.5" 256 GB (Windows is installed here)
        Seagate ST31000524AS 1 TB (3/4 is reserved for files; 1/4 is for Ubuntu, 16 GB swap included)

    Read the article

  • Ubuntu 12.04 nomodeset fixes boot problem but causes screen resolution to get stuck

    - by Thunder
    I've been searching the Ask Ubuntu forum for the past 3 days trying to figure out what's going on with my system, and I have tried a lot of things but to no avail. So I will explain my situation, tell you what I have tried, and hope someone can help me :)

    I have an HP Workstation xw4100 with a Pentium 4 3.00 GHz, 1.5 GB RAM, and an NVIDIA Quadro4 380 XGL graphics card. It came with Windows XP, and I set it up (with WUBI) to dual-boot with Ubuntu 12.04.

    After installation I had the problem that so many people had with it booting to a black screen (mine was actually booting to the basic terminal shell), which is fixed by adding nomodeset to the GRUB boot line. When I do that, my screen resolution becomes stuck at 1280x768 (as opposed to 1366x768 before adding nomodeset; also, when running XP the best resolution is 1280x720).

    When I go to "Additional Drivers" it doesn't show any proprietary drivers, so I manually installed them using these commands:

        sudo apt-add-repository ppa:ubuntu-x-swat/x-updates
        sudo apt-get update
        sudo apt-get install nvidia-current

    but after rebooting, that made the graphics even worse (now stuck at 800x600). So I tried to configure the drivers with sudo nvidia-xconfig, but that simply created an empty xorg.conf file. I found one place where a guy gave information to manually input into the xorg.conf file, but that had no effect at all. Lastly, I tried to install previous versions of the NVIDIA drivers, but they wouldn't even fully install. So now I have just re-installed Ubuntu 12.04, and I either need to find a better solution to the first problem (nomodeset) or get the nouveau driver correctly configured to work with my nvidia graphics. Thanks for your help ahead of time!
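
    For what it's worth, a card as old as the Quadro4 380 XGL is typically covered only by NVIDIA's legacy driver series, not by nvidia-current, which may explain why the PPA driver made things worse. A sketch of getting back to a clean nouveau setup (assumes the install still boots to a shell):

        sudo apt-get purge nvidia-current
        sudo rm /etc/X11/xorg.conf    # let X auto-detect the card with nouveau
        sudo reboot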

    Read the article

  • xrandr problem with 1900x1080

    - by Eslam
    I have a problem with xrandr. I successfully executed the following:

        # cvt 1900 1080
        # xrandr --newmode "1900x1080" 170.75 1904 2024 2224 2544 1080 1083 1093 1120 -hsync +vsync

    but unfortunately the following command:

        # xrandr --addmode VGA-0 1900x1080

    returned the following error:

        X Error of failed request:  BadMatch (invalid parameter attributes)
          Major opcode of failed request:  153 (RANDR)
          Minor opcode of failed request:  18 (RRAddOutputMode)
          Serial number of failed request:  29
          Current serial number in output stream:  30

    The output of the following commands might help in identifying the problem:

        # glxinfo | grep -i opengl
        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: GeForce 310M/PCIe/SSE2
        OpenGL version string: 3.3.0 NVIDIA 310.14
        OpenGL shading language version string: 3.30 NVIDIA via Cg compiler
        OpenGL extensions:

        # lspci | grep -i vga
        01:00.0 VGA compatible controller: NVIDIA Corporation GT218 [GeForce 310M] (rev a2)

    Any ideas what might be wrong?

    Read the article

  • Is there going to be TwinView (or an alternative) implemented for nouveau?

    - by lisak
    As I've had heavy issues with the nvidia driver regarding the performance of basic X window operations (window moving, resizing, scrolling), I switched to the nouveau driver. But I lost the dual-screen setup that I previously had thanks to nvidia's TwinView feature... I'd rather have fluent X than dual screen, but having dual screen would be nice, so I'm wondering if there is already a nouveau alternative to nvidia's TwinView, or if one is going to be implemented.
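
    nouveau exposes the standard RandR 1.2 interface, so a TwinView-style extended desktop can be set up with xrandr alone. A minimal sketch (the output names LVDS-1 and VGA-1 are assumptions; check xrandr -q for the real ones):

        xrandr -q
        xrandr --output LVDS-1 --auto --output VGA-1 --auto --right-of LVDS-1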

    Read the article

  • 12.04 Unity 3D 80% CPU load with Compiz

    - by user39288
    EDIT: I have been able to determine that the problem is not compiz but actually Xorg. I don't know why, but by quickly maximizing terminal and taking a screenshot with top running before the problem went away, I can see that Xorg takes up 72% of the CPU, with bamfdaemon taking up 18% and compiz taking up 14%. The nvidia drivers seem to be to blame; I will play more with the settings and perhaps do a clean nvidia-current install to try to fix the problem.

    I am having a very annoying problem with high CPU usage, running 12.04 with the latest drivers and nvidia-current installed. I had not had any issues for days; now I have a strange problem. Unity 3D runs great most of the time: 1-2% CPU usage with only transmission running in the background, and windows open and close smoothly. However, no matter what programs are open, if I minimize all open programs to the Unity launcher on the left, my CPU jumps to about 80% and all maximize and minimize effects slow down. Mouse movement stays smooth the whole time, but Unity becomes unresponsive for up to 30 seconds at times.

    Hitting alt+tab to bring up even a single window fixes the problem; the window I bring back up doesn't even have to be maximized. Hitting the super key to bring up the dash makes the CPU drop back to idle until I close it, then the high CPU usage resumes. I believed the problem was compiz, but even with only terminal running top, I have to minimize it for the problem to show, so I can't see the problem process; I can only tell about the high CPU usage using indicator-sysmonitor. I even tried quitting the indicator, but I can still tell performance is very poor in all applications when everything is minimized.

    I have reset compiz back to defaults, tried the post-release update nvidia drivers, and played with the vsync settings in both nvidia-settings and compiz. I even forced the refresh rate, but cannot solve the problem. The problem does NOT occur in Unity 2D. Specs: Core 2 Duo 2.0 GHz, 4 GB DDR2 RAM, 2x 320 GB HDDs in RAID 0, and an Nvidia GTX 260M graphics card.

    Read the article

  • What video graphics card would permit Ubuntu's standard driver to work well?

    - by Rick
    I installed Ubuntu 12.04. All seemed well until I installed the nvidia driver; then, crashola! This situation is untenable. It seems I cannot trust nvidia, and it seems that I cannot rely on Ubuntu gurus to test 3rd-party drivers. So, do some video card manufacturers not care enough about the Linux market to test their drivers, or are there so many 3rd-party video cards that the Ubuntu folks cannot test any of them? Hence the question: what video graphics card would permit me to use Ubuntu's standard driver, so that I do not have to rely on nvidia's or any other 3rd-party driver? Perhaps I could then install THAT card and have things work. The Ubuntu standard driver actually worked prior to installing the nvidia driver, but not well, because the display flickered, and flickering gives me a headache.

    Read the article
