Search Results

Search found 14216 results on 569 pages for 'nvidia settings'.


  • Xorg eating up too much RAM on Ubuntu 9.10 box

    - by Yang
    Xorg is eating up 444MB of 2GB total RAM on my Ubuntu 9.10 x86_64 machine with nvidia drivers installed for the nvidia G86 (GeForce 8300 GS). top shows:

        top - 18:21:41 up 6 days, 2:40, 9 users, load average: 0.46, 1.12, 1.22
        Tasks: 266 total, 3 running, 262 sleeping, 1 stopped, 0 zombie
        Cpu(s): 8.4%us, 2.0%sy, 0.0%ni, 89.1%id, 0.5%wa, 0.0%hi, 0.0%si, 0.0%st
        Mem: 2055736k total, 1965136k used, 90600k free, 3952k buffers
        Swap: 979924k total, 979908k used, 16k free, 102636k cached

          PID USER PR NI VIRT  RES  SHR  S %CPU %MEM TIME+    COMMAND
         1432 root 20  0 1154m 442m 7492 S    8 22.0 32:56.97 Xorg
        18462 yang 20  0 1001m 219m 8356 S    0 10.9  5:13.25 chrome
        24099 yang 20  0  865m  83m  13m S    0  4.2  0:06.91 chrome

    xrestop shows:

        xrestop - Display: :0.0
        Monitoring 47 clients. XErrors: 0
        Pixmaps: 40430K total, Other: 142K total, All: 40573K total

        res-base Wins GCs Fnts Pxms Misc Pxm mem Other Total PID Identifier
        1c00000  21   46  1    19   697  9128K   18K   9146K 3169 x-nautilus-desktop
        1000000  4    3   0    17   194  9000K   4K    9004K 3134 gnome-settings-daemon
        1600000  51   2   1    25   1100 7648K   28K   7676K ?    compiz

    For comparison, here's my other Ubuntu box, which also has compiz etc. enabled but with ATI RV370 (Radeon X300SE):

        top - 18:18:18 up 58 days, 4:27, 9 users, load average: 0.00, 0.00, 0.00
        Tasks: 224 total, 1 running, 223 sleeping, 0 stopped, 0 zombie
        Cpu(s): 0.3%us, 0.3%sy, 0.0%ni, 98.8%id, 0.5%wa, 0.0%hi, 0.0%si, 0.0%st
        Mem: 1024964k total, 987124k used, 37840k free, 247012k buffers
        Swap: 2048276k total, 94296k used, 1953980k free, 264744k cached

          PID USER PR NI VIRT  RES  SHR  S %CPU %MEM TIME+     COMMAND
        24324 yang 20  0 61936 35m  6364 S    0  3.5   4:35.84 nxagent
         1768 ntop 20  0  190m 32m  5388 S    1  3.2 283:36.15 ntop
         1178 root 20  0 60588 29m  1788 S    0  3.0   5:48.89 console-kit-dae
          ...
         1315 root 20  0  343m 4956 4020 S    0  0.5   3:43.87 Xorg

    Any ideas on how to get to the bottom of this? (i.e. not "Log out"/"Reboot") Thanks in advance.
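    One way to see which client's pixmap usage is actually growing is to snapshot xrestop non-interactively and diff the snapshots over time. A sketch using xrestop's batch mode:

        # one batch-mode sample of per-client X resource usage, saved for later comparison
        xrestop -b -m 1 > xres-before.txt
        # ...reproduce the workload for a while, then sample again...
        xrestop -b -m 1 > xres-after.txt
        diff xres-before.txt xres-after.txt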


  • Enabling hardware acceleration and Xinerama for multi-monitor/multi-GPU in Linux

    - by mynameiscoffey
    My current setup is three monitors connected as follows (monitors listed from left to right):

        GPU0 (nVidia GTX 280):
            - Dell 2405FPW (1920x1200)
            - Dell U2410 (1920x1200)
        GPU1 (nVidia 210):
            - Dell 2405FPW (1920x1200)

    Works like a charm in Windows 7, not so much in Linux. I seem to have only three real options:

    1. Run all three monitors as separate X screens. I get hardware acceleration, but as they are all independent X sessions I cannot move windows between them and can only have Firefox open on one at any given time.
    2. Run the two on GPU0 in TwinView mode and have GPU1 as a separate X screen. Same limitation as 1, but at least two monitors work together. I did occasionally have an issue where Linux saw both monitors on GPU0 as a single large monitor, however.
    3. Enable Xinerama and have everything work as I want, except that hardware acceleration is gone and the display is Windows 95 style choppy.

    My ideal solution would be to have all screens working as they do under Xinerama without the limitation of having hardware acceleration disabled. I don't even care if that means rendering all three on GPU0 and somehow farming out the display of the third monitor to GPU1; whatever works. My question is this: is there any way to accomplish this? I don't feel like my use case is so out there that there shouldn't be at least some form of support (beyond the three limited options presented above). Or is my best option going to be to just suck it up and pick up a better card that can handle three outputs by itself, replacing both?
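    For reference, option 3 corresponds to an xorg.conf layout roughly like the sketch below (identifiers and BusIDs are assumptions; check `lspci | grep VGA` for the real bus addresses). The catch, as noted above, is that the NVIDIA driver disables GLX hardware acceleration whenever Xinerama is active:

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            Screen 1 "Screen1" RightOf "Screen0"
            Option "Xinerama" "1"
        EndSection

        Section "Device"
            Identifier "GTX280"
            Driver "nvidia"
            BusID "PCI:1:0:0"    # assumption
        EndSection

        Section "Device"
            Identifier "GT210"
            Driver "nvidia"
            BusID "PCI:2:0:0"    # assumption
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device "GTX280"
            Option "TwinView" "1"
        EndSection

        Section "Screen"
            Identifier "Screen1"
            Device "GT210"
        EndSection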


  • Gigabit LAN not working on ASUS M2N-MX

    - by chmod
    Today I replaced my Fast Ethernet switch with a newly bought gigabit switch (DGS-1008A). All computers in my house are displaying that the connection speed is 1 Gbps except for one. The computer that is not working is an ASUS M2N-MX, which contains an onboard gigabit NIC. See the ASUS link for confirmation: http://www.asus.com/Motherboards/AMD_AM2/M2NMX/

    Here is some info on the machine:

        OS: Windows 7 Ultimate SP1 64bit
        BIOS version: 1004 (latest)
        Driver: installed via Windows Update (latest from Windows Update)
        Windows Update: fully updated

    The machine was reformatted 3 days ago, so it's pretty clean: no junk, no viruses, etc. The cable is Amp CAT5E, 5 meters. In Device Manager, the name of the NIC is "NVIDIA nForce 10/100/1000 Mbps Ethernet".

    What I have tried:

    1. I tried to install the driver provided on the ASUS website, but there isn't one for Windows 7 64 or Vista 64.
    2. I tried to install the latest nForce340/6100 package, downloaded from the Nvidia website. However, the LAN driver refuses to install; it complains that I already have the best driver installed.
    3. I looked in the driver properties, Advanced tab, Speed/Duplex settings, in an attempt to force it to run at 1000 Mbps, but there is no 1000 Mbps choice, only 10 and 100 Mbps.
    4. I changed the CAT5E cable (used one from another computer that is running gigabit without problems).

    Does anyone have this issue or know how to solve it? Thanks
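    As a quick sanity check on what Windows actually negotiated, the link speed can be read from the command line (Speed is reported in bits per second; wmic ships with Windows 7):

        rem list enabled adapters with their negotiated link speed
        wmic nic where "NetEnabled=true" get Name,Speed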


  • Screen occasionally flashes black when under load, sometimes does not recover

    - by Oak
    I've built a brand new machine, but to my horror my monitor occasionally flashes black for around a second, then returns to normal. This happens under load (watching videos / playing games) but only sometimes; e.g. it doesn't occur in "Batman: Arkham City" but does in "XCOM: Enemy Unknown". When watching videos, it also occurs when not watching them full-screen, and it sometimes even occurs when the machine isn't doing anything, just sitting at the desktop and moving the mouse around. Has anyone run into this problem and knows of any solution? Additionally, sometimes after the black screen, it won't return to normal, instead turning completely corrupt. In these cases even quitting the application doesn't help, but physically disconnecting and reconnecting the monitor fixes the problem. This problem did not occur on my earlier machine, which used the same physical monitor.

    Additional details:

        Windows Server 2012, configured as Windows 8, with latest updates installed
        NVIDIA GeForce GTX 660 Ti, with latest driver installed
        Ample amounts of CPU and RAM for playing the above games and for watching videos

    I've read about similar problems elsewhere but could not find a working solution:

        http://www.youtube.com/watch?v=Zt00C-HXFbA&noredirect=1
        http://www.sevenforums.com/hardware-devices/59126-monitor-flashing-black.html
        https://eu.battle.net/d3/en/forum/topic/4079098908?page=4
        http://www.tomshardware.com/forum/347422-33-screen-flickering-black-nvidia-driver-update
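    Symptoms like this often line up with the driver being reset by Windows' timeout detection and recovery (TDR); if that is what's happening here, each black flash should leave an Event ID 4101 entry from source "Display" in the System log. A hedged check:

        rem show the five most recent display-driver TDR recoveries, newest first
        wevtutil qe System /q:"*[System[Provider[@Name='Display'] and (EventID=4101)]]" /c:5 /f:text /rd:true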


  • configure a Macbook Pro to use external monitor at boot (Debian Linux)

    - by Eric
    In the spirit of reuse, I've installed Debian (version 6.0.5 "squeeze") on my wife’s old Macbook Pro (circa 2009 or so), to repurpose it for various tasks. The catch is that the display is flaky. It will last a random amount of time, between 2 minutes and 2 hours, before freezing and graying out. This is a known issue with that generation of MBP. Fortunately it’s no problem for me, as I plan to use it with an external monitor anyway. Which brings us to the problem: how do I configure this thing to output to the external display by default, and hopefully disable the built-in LCD? The ideal solution would be to modify a setting in the EFI (BIOS), but I’m not holding out much hope for that. Next best thing would be a kernel option I can pass to the NVIDIA driver. What won’t work is a solution that doesn’t give me a display until X starts. I need to have console access, especially given that the built-in LCD is dying and any day now might give out completely. So far I haven’t been able to find anything online. lspci says I’ve got an NVIDIA GeForce 9400M. Help is much appreciated! Eric

    PS if this question is better suited to the Unix & Linux area, pls advise and I will move it.
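    One avenue worth trying, assuming the machine uses the kernel's nouveau KMS driver rather than the proprietary blob: the video= boot parameter can disable the internal panel and enable the external connector, and it takes effect for the console, before X ever starts. A sketch (the connector names are assumptions; the real ones are listed under /sys/class/drm):

        # /etc/default/grub -- ":d" disables a connector, ":e" force-enables one
        GRUB_CMDLINE_LINUX_DEFAULT="quiet video=LVDS-1:d video=DVI-D-1:e"
        # then regenerate the grub configuration
        sudo update-grub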


  • Monitor flickers in native resolution

    - by ptikobj
    With my new Samsung SyncMaster BX2450 I have the following problem: in Windows XP (SP2), all resolutions above 1440x900 have either strange pixel errors or extreme flickering. The effect seems to worsen at higher resolutions. In particular, I would like to run the monitor at its native resolution (1920x1080), but I can't look at the monitor for longer than 5 seconds because of the flickering... My graphics card is a GeForce FX 5200 with the most up-to-date driver (according to Nvidia.com: Forceware 175.19), and the monitor is connected to its DVI output. The strange thing is that under Ubuntu 10.04 all resolutions work just perfectly, so the display must be alright.

    edit: It seems to be a driver problem... if I use the proprietary NVIDIA drivers in Ubuntu, I have the same problem as in Windows. So let me reformulate my question: is there a modified/alternative GeForce FX 5200 driver (as there is in Ubuntu) for Windows that allows me to use 1920x1080 without problems? I already tried the Omega drivers: unfortunately, it still looks poor at the native resolution.
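    For what it's worth, these symptoms are consistent with exceeding single-link DVI bandwidth at full 1920x1080 blanking, which a reduced-blanking (CVT-RB) mode avoids. A sketch for the Ubuntu side, where custom modes are easy to test with xrandr (the output name DVI-0 is an assumption; check `xrandr -q`):

        # generate a reduced-blanking modeline that stays within single-link DVI limits
        cvt -r 1920 1080 60
        # add and apply the mode it prints
        xrandr --newmode "1920x1080R" 138.50 1920 1968 2000 2080 1080 1083 1088 1111 +hsync -vsync
        xrandr --addmode DVI-0 1920x1080R
        xrandr --output DVI-0 --mode 1920x1080R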


  • Computer randomly restarting, both in game and out of game

    - by eric
    First, my specs:

        AMD Phenom II X4 955 processor, 3.2 GHz
        20 GB DDR3 RAM
        4 GB Nvidia GeForce GTX 770
        850 W Corsair TX850W PSU
        Gigabyte UD3 mobo
        Windows 7 Professional

    I recently upgraded my video card to the GTX 770 and upgraded my PSU to the 850 W that's in it now. I did a reformat with the installation of the new GPU and PSU and started fresh, and only have a couple of programs installed (Diablo 3, Nvidia Control Panel, WoW, and Steam). All drivers are up to date and everything is hooked up correctly. The problem is it will randomly shut down. No blue screen; it just turns itself straight off and reboots after a couple of seconds. Occasionally I will have to unplug the power cable from the PSU for a few minutes, then reconnect it and it will start up. It seems pretty random. Sometimes it does it when my PC is just sitting there on the home screen, sometimes it does it during games, and sometimes it doesn't do it for days at a time. I noticed the PSU felt hot, so I put an extra fan blowing straight onto both the PSU and GPU, and neither feels overly hot after it shuts down now. Could it just be a PSU problem? The PSU was taken from another machine but wasn't having this problem in that machine. I have seen a few articles online about the GTX 770 doing the same thing, but I haven't found any answers or solutions. Any help will be appreciated. I'm sure the 850 W is enough to power my machine; I'm just stumped and have run out of ideas to fix it. I have even returned the video card for another, thinking it might have been an issue with that particular card, but I'm still getting the same problem.
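    An instant power-off with no blue screen usually leaves no minidump, but Windows records the unexpected shutdown on the next boot as Kernel-Power Event ID 41. Confirming those events exist (and checking their timestamps against the crashes) is a cheap first step:

        rem show the five most recent unexpected-shutdown events, newest first
        wevtutil qe System /q:"*[System[Provider[@Name='Microsoft-Windows-Kernel-Power'] and (EventID=41)]]" /c:5 /f:text /rd:true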


  • Hyper-V Blue Screens with Nvidia GeForce 8400 GS Graphics Card

    - by Mahmoud Saleh
    I am using Windows Server 2008 R2 Enterprise x64. After installing the Hyper-V role and restarting the machine, I get a blue screen error and an immediate reboot. I have Googled the issue and tracked it down to the graphics card, so I uninstalled it, and then Windows loads fine. However, after installing the graphics driver again, the Blue Screen returns. The graphics card is an Nvidia GeForce 8400 GS. Does anyone know how I can resolve this issue?


  • Updated Security Baseline (7u45) impacts Java 7u40 and before with High Security settings

    - by costlow
    The Java security baseline has been increased from 7u25 to 7u45. For versions of Java below 7u45, this means unsigned Java applets, or Java applets that depend on JavaScript LiveConnect calls, will be blocked when using the High Security setting in the Java Control Panel. This issue only affects Applets and Web Start applications. It does not affect other types of Java applications.

    The Short Answer

    Upgrading to Java 7 update 45 will automatically fix this, and doing so is strongly recommended.

    The More Detailed Answer

    There are two items involved, as described on the deployment flowchart:

        The Security Baseline – a dynamically updated attribute that checks to see which Java version contains the most recent security patches.
        The Security Slider – the user-controlled setting of when to prompt/run/block applets.

    The Security Baseline

    Java clients periodically check in to understand which version contains the most recent security patches. Versions released in between contain bug fixes. For example:

        7u25 (July 2013) was the previous secure baseline.
        7u40 contained bug fixes. Because it did not contain security patches, users were not required to upgrade and were welcome to remain on 7u25.
        When 7u45 was released (October 2013), this critical patch update contained security patches and raised the secure baseline. Users are required to upgrade from earlier versions.

    For users that are not regularly connected to the internet, there is a built-in expiration date. Because of the pre-established quarterly critical patch updates, we are able to determine an approximate date of the next version. A critical patch released in July will have its successor released, at latest, in July + 3 months: October.

    The Security Slider

    The security slider is located within the Java Control Panel and determines which Applets & Web Start applications will prompt, which will run, and which will be blocked. One of the questions used to determine prompt/run/block is, "At or Above the Security Baseline."

    The Combination

    JavaScript calls made from LiveConnect do not reside within signed JAR files, so they are considered to be unsigned code. This is correct within networked systems even if the domain uses HTTPS, because signed JAR files represent signed "data at rest," whereas TLS (often called SSL) stands for "Transport Layer Security" and secures the communication channel, not the contents/code within the channel.

    The resulting flow for users who click "update later" is:

        Is the browser plug-in registered and allowed to run? Yes.
        Does a rule exist for this RIA? No rules apply.
        Does the RIA have a valid signature? Yes, and not revoked.
        Which security prompt is needed? The JRE is below the baseline, because 7u45 is the baseline and the user clicked "update later."

    Under the default High setting, unsigned code is set to "Don’t Run," so users see the block dialog.

    Additional Notes

    End users can control their own security slider within the Control Panel. System administrators can customize the security slider during automated installations. As a reminder, in the future, Java 7u51 (January 2014) will block unsigned and self-signed Applets & Web Start applications by default.
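    For administrators who deploy Java, the slider can be pre-set and locked through the system-level deployment configuration. A minimal sketch (file locations vary by OS, and the trailing .locked key is the standard mechanism for locking a property):

        # system-level deployment.properties, referenced from deployment.config
        deployment.security.level=HIGH
        # prevent end users from lowering the setting
        deployment.security.level.locked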


  • ASP 3.0 Folder/File Permissions Settings (ASP Classic)

    - by ASP Pee-Wee
    Dear Stack Exchange,

    I have built a form input page in HTML that posts to an ASP handler/processor .asp file. The form handler/processor .asp file contains only <% VBScript here %> and no HTML output whatsoever. The .asp file was never intended to be a "web viewable" .asp file like an .asp home page or HTML file would be. It's supposed to be for my eyes only, not the public's; however, it does need to take info posted by the public and do something with it on its end. I have used VBScript/ASP 3.0 to build the form handler/processor file and would like to know how to keep someone from viewing the actual VBScript in it. I am aware of obfuscation, but I would like to know how to keep prying eyes from even being able to look at the obfuscated code in the handler/processor file. I realize that the server executes the .asp file first before outputting anything to the browser, so I guess my main concern is mostly that someone might "download" the form handler/processor .asp file and then view its contents on their machine. Assuming the form handler .asp file is where it is, behind the root, and is on a Windows server (no htaccess approach), how could one protect it so that it could never be viewed or simply pulled down via anonymous FTP or something like that? Is there something like "script only" permissions that the system administrator could set up for a particular folder? Remember, with shared hosting I can't go above the root. If so, would the form still be able to post? How would any of you go about protecting the ASP file, in addition to obfuscation? Any help would be greatly appreciated.

    Thanks,
    ASP Pee-Wee


  • How to set all locale settings in Ubuntu

    - by Christian Schneider
    A remotely installed application has some encoding problems, while on my local machine it runs fine. What is the best way to "copy" my locales to the remote machine? The locales on my personal machine are configured like this:

        $ locale
        LANG=de_DE.UTF-8
        LANGUAGE=de_DE:en
        LC_CTYPE="de_DE.UTF-8"
        LC_NUMERIC=en_US.UTF-8
        LC_TIME=en_US.UTF-8
        LC_COLLATE="de_DE.UTF-8"
        LC_MONETARY=en_US.UTF-8
        LC_MESSAGES="de_DE.UTF-8"
        LC_PAPER=en_US.UTF-8
        LC_NAME=en_US.UTF-8
        LC_ADDRESS=en_US.UTF-8
        LC_TELEPHONE=en_US.UTF-8
        LC_MEASUREMENT=en_US.UTF-8
        LC_IDENTIFICATION=en_US.UTF-8
        LC_ALL=
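    A minimal sketch of one way to mirror these on the remote machine, assuming it is also Debian/Ubuntu (on plain Debian, the locales may need to be enabled in /etc/locale.gen first):

        # on the remote machine: make sure both locales exist, then set the same defaults
        sudo locale-gen de_DE.UTF-8 en_US.UTF-8
        sudo update-locale LANG=de_DE.UTF-8 LANGUAGE=de_DE:en \
            LC_NUMERIC=en_US.UTF-8 LC_TIME=en_US.UTF-8 LC_MONETARY=en_US.UTF-8 \
            LC_PAPER=en_US.UTF-8 LC_NAME=en_US.UTF-8 LC_ADDRESS=en_US.UTF-8 \
            LC_TELEPHONE=en_US.UTF-8 LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=en_US.UTF-8

    Note that ssh can also forward LC_* variables from the client session (see AcceptEnv in sshd_config), which may override these per connection.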


  • Two 12.04 machines have the same display settings but different results

    - by durron597
    I have one machine with a relatively fresh install of 12.04 and one that I inherited. The terminal window on the inherited machine has a really weird font, and the regular one is what I would expect. Especially the behavior of the "m" character is messed up. Note: both of these machines are on the same KVM switch. Here is what I've tried:

        MyUnity on both machines seems the same
        .bashrc on both machines is similar in all the ways that would matter for this issue
        The terminal profiles on both machines are the default

    Here are the xrandr outputs:

    Good xrandr:

        Screen 0: minimum 320 x 200, current 1280 x 1024, maximum 4096 x 4096
        VGA1 connected 1280x1024+0+0 (normal left inverted right x axis y axis) 376mm x 301mm
           1280x1024 60.0 + 76.0 75.0* 72.0 70.0
           1152x864 75.0
           1024x768 75.1 70.1 60.0
           832x624 74.6
           800x600 72.2 75.0 60.3
           640x480 72.8 75.0 66.7 60.0
           720x400 70.1

    Bad xrandr:

        Screen 0: minimum 8 x 8, current 1280 x 1024, maximum 8192 x 8192
        DP-0 disconnected (normal left inverted right x axis y axis)
        DP-1 disconnected (normal left inverted right x axis y axis)
        DP-2 disconnected (normal left inverted right x axis y axis)
        DP-3 connected 1280x1024+0+0 (normal left inverted right x axis y axis) 376mm x 301mm
           1280x1024 60.0*+ 76.0 75.0 72.0 70.0
           1152x864 75.0
           1024x768 75.0 70.1 60.0
           800x600 75.0 72.2 60.3
           640x480 75.0 72.8 59.9

    Finally, here are screenshots of both machines (good and bad); it really seems to only be Terminal. I have Ask Ubuntu behind the terminal window for comparison. Any thoughts as to what this might be?
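    Since the problem is confined to the terminal's monospace rendering, one quick check is whether the generic "monospace" alias resolves to different fonts on the two machines (a hedged guess at the cause, but cheap to verify):

        # which concrete font the monospace alias maps to
        fc-match monospace
        # the candidate list in priority order
        fc-match -s monospace | head -5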


  • java slick2D - problem using ScalableGame class

    - by nellykvist
    I have a problem adjusting the size of the screen using the ScalableGame class from the Slick2D library. What I want to achieve: whenever I change the display size, the background should adjust to the screen size, and objects (images, graphic shapes) should fit (scale). This is how the state looks by default; I can change the screen size, but images and graphic shapes do not scale:

        appGameContainer = new AppGameContainer(
            new ScalableGame(new AppStateController(), Settings.video.getWidth(), Settings.video.getHeight(), true)
        );
        appGameContainer.setDisplayMode(Settings.video.getWidth(), Settings.video.getHeight(), Settings.video.isFullScreen());
        appGameContainer.start();

    If I add 100 to the width/height in the ScalableGame constructor:

        appGameContainer = new AppGameContainer(
            new ScalableGame(new AppStateController(), Settings.video.getWidth() + 100, Settings.video.getHeight() + 100, true)
        );
        appGameContainer.setDisplayMode(Settings.video.getWidth(), Settings.video.getHeight(), Settings.video.isFullScreen());
        appGameContainer.start();

    If I add 100 to the width/height of the display:

        appGameContainer = new AppGameContainer(
            new ScalableGame(new AppStateController(), Settings.video.getWidth(), Settings.video.getHeight(), true)
        );
        appGameContainer.setDisplayMode(Settings.video.getWidth() + 100, Settings.video.getHeight() + 100, Settings.video.isFullScreen());
        appGameContainer.start();
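    For context, ScalableGame's width/height arguments are meant to be the resolution the game was authored at, while setDisplayMode sets the actual window size; ScalableGame then scales rendering between the two. A hedged sketch (designWidth/designHeight are hypothetical constants standing in for the authored resolution, not names from the code above):

        // pass the DESIGN resolution to ScalableGame and the REAL window size to
        // setDisplayMode; when the two differ, ScalableGame scales the whole scene
        final int designWidth = 800, designHeight = 600;  // hypothetical authored size
        appGameContainer = new AppGameContainer(
            new ScalableGame(new AppStateController(), designWidth, designHeight, true)
        );
        appGameContainer.setDisplayMode(Settings.video.getWidth(), Settings.video.getHeight(), Settings.video.isFullScreen());
        appGameContainer.start();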


  • Auto-detect proxy settings on network

    - by Ali Lown
    I am having problems trying to run web browser software on the local network through the proxy. When running off the profile drive, which is on a network share, the system is unable to auto-detect proxy settings. When running off the local C drive, the browsers are able to correctly auto-detect the settings. The error from the browser is about being unable to fetch the proxy configuration file. Is this some form of authentication preventing it from retrieving the settings when running off the network location?

    PS. Would this be better off on Super User?
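    A hedged first check is whether WPAD discovery works at all from the affected context: the wpad hostname must resolve, and the PAC file must be fetchable, e.g.:

        rem does the WPAD name resolve on this network?
        nslookup wpad
        rem if it does, try opening http://wpad/wpad.dat in the browser to see whether the PAC file downloads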


  • How to change System default language on GNOME3?

    - by Vor
    I just installed GNOME 3 on my Ubuntu. Everything worked fine until I restarted the computer. Then I received a message asking if I wanted to change my folder names to other ones (in a different language; I don't know which it is, but it looks like Chinese). I pressed 'keep old names', but it still changed all my folder names! And also the rest of the names (like Settings and all that stuff). So can you give me directions on where to click, since all the English names changed to non-English ones and I simply don't know what any of them means?
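    A hedged way out from a terminal, which sidesteps the unreadable menus (this assumes Ubuntu's xdg-user-dirs-gtk package, which is what shows that folder-rename prompt):

        # re-run the folder-name updater under an English locale to rename the XDG folders back
        LANG=en_US.UTF-8 xdg-user-dirs-gtk-update
        # set the system-wide default language back to English
        sudo update-locale LANG=en_US.UTF-8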


  • Where default settings are stored after applying GPO?

    - by tester5566
    When I apply a GPO that changes service startup settings, where are the default service startup settings kept? And how can I read and modify them? The reason for the question is that I have a hundred servers where most services are disabled by a baseline GPO for hardening purposes. I want to relax this GPO by removing some services, but I do not want the service startup settings to revert to the defaults after the GPO is relaxed. So I want to keep the actual hardened state as the default state, but allow local admins to change it if necessary. Thank you
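    For reference, the effective startup type lives in the registry as the Start value under each service's key; the GPO simply overwrites that value. An illustrative query (Spooler is just an example service):

        rem Start values: 2 = Automatic, 3 = Manual, 4 = Disabled
        reg query HKLM\SYSTEM\CurrentControlSet\Services\Spooler /v Start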


  • Webmaster Tools - URL Parameters Settings Do Not Work

    - by David
    Google Webmaster Tools shows problems with duplicate title tags under Optimization - HTML Improvements, for example:

        ???????? Mitsubishi Electric Mr. Slim PC Series PC-3KAKLT (220V) 30000 BTU
        > /????-????/mitsubishi-mr-slim-pc3kaklt-30000-btu.html
        > /????-????/mitsubishi-mr-slim-pc3kaklt-30000-btu.html?category_id=96

    These two pages have exactly the same content, a rel-canonical tag is set, and they are (no longer) linked to internally. Additionally, about one month ago we used the Configuration - URL Parameters setting to set the category_id parameter to "No: Doesn't affect page content". However, Google is still showing these HTML improvements (and rankings dropped dramatically). What else can we do here? Best, David


  • How to get current connection settings

    - by Peter Larsson
    SELECT  name AS Setting,
            CASE
                WHEN @@OPTIONS & number = number THEN 'ON'
                ELSE 'OFF'
            END AS Value
    FROM    master..spt_values
    WHERE   type = 'SOP'
            AND number > 0


  • Windows Remote-App Server 2012 Office 2013 User Settings not saved

    - by dave
    I have a Windows Server 2012 machine with RemoteApps enabled, running the latest patches etc. It has Office 2013 installed, and Excel and Word are shared to all users. Now I have the problem that after each reboot all user settings are lost. I have a few users who pin previously opened documents so they don't need to remember all the paths, and those are all gone after a reboot. The last-opened documents list is also empty, and after a server reboot Office 2013 brings up the first-time-setup window where it asks if you want to connect to SkyDrive and all that. In the RemoteApps collection I enabled a 100 GB user profile disk (drive E:) for storing user profile data. There is a domain, of course, and there is no GPO preventing users from storing settings etc. We also have an older Terminal Server 2003 in the same domain where this is not happening. Any ideas why all the settings are lost after a reboot?
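    With user profile disks in play, a hedged first check is what the collection is actually configured to persist; excluded folder paths can silently drop AppData, which is where Office keeps these settings. The collection name below is an assumption:

        # show the collection's user-profile-disk settings, including
        # included/excluded folder paths (RemoteDesktop module, Server 2012)
        Get-RDSessionCollectionConfiguration -CollectionName "RemoteApps" -UserProfileDisk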


  • Can't get lm-sensors to load ATI Radeon temp and fan or output all settings

    - by woody
    New to Linux and having minor issues :/ . I followed this guide initially but did not receive the proper output; it did not show my ATI Radeon HD 5000 temp or fan speed. Then I used this guide, and the same problems were exhibited. No issues installing and no errors. I think it's not reading i2c for some reason. The proprietary driver is installed and functioning correctly according to fglrxinfo. I can use aticonfig commands and view both temp and fan. Any ideas on how to get the ATI Radeon sensors working under 'sensors'? When I run 'sudo sensors-detect', this is my output:

        # sensors-detect revision 5984 (2011-07-10 21:22:53 +0200)
        # System: LENOVO IdeaPad Y560 (laptop)
        # Board: Lenovo KL3

        This program will help you determine which kernel modules you need to load to
        use lm_sensors most effectively. It is generally safe and recommended to
        accept the default answers to all questions, unless you know what you're doing.

        Some south bridges, CPUs or memory controllers contain embedded sensors.
        Do you want to scan for them? This is totally safe. (YES/no): y
        Silicon Integrated Systems SIS5595... No
        VIA VT82C686 Integrated Sensors... No
        VIA VT8231 Integrated Sensors... No
        AMD K8 thermal sensors... No
        AMD Family 10h thermal sensors... No
        AMD Family 11h thermal sensors... No
        AMD Family 12h and 14h thermal sensors... No
        AMD Family 15h thermal sensors... No
        AMD Family 15h power sensors... No
        Intel digital thermal sensor... Success! (driver `coretemp')
        Intel AMB FB-DIMM thermal sensor... No
        VIA C7 thermal sensor... No
        VIA Nano thermal sensor... No

        Some Super I/O chips contain embedded sensors. We have to write to standard
        I/O ports to probe them. This is usually safe.
        Do you want to scan for Super I/O sensors? (YES/no): y
        Probing for Super-I/O at 0x2e/0x2f
        Trying family `National Semiconductor/ITE'... Yes
        Found unknown chip with ID 0x8502
        Probing for Super-I/O at 0x4e/0x4f
        Trying family `National Semiconductor/ITE'... No
        Trying family `SMSC'... No
        Trying family `VIA/Winbond/Nuvoton/Fintek'... No
        Trying family `ITE'... No

        Some hardware monitoring chips are accessible through the ISA I/O ports.
        We have to write to arbitrary I/O ports to probe them. This is usually safe
        though. Yes, you do have ISA I/O ports even if you do not have any ISA slots!
        Do you want to scan the ISA I/O ports? (YES/no): y
        Probing for `National Semiconductor LM78' at 0x290... No
        Probing for `National Semiconductor LM79' at 0x290... No
        Probing for `Winbond W83781D' at 0x290... No
        Probing for `Winbond W83782D' at 0x290... No

        Lastly, we can probe the I2C/SMBus adapters for connected hardware
        monitoring devices. This is the most risky part, and while it works
        reasonably well on most systems, it has been reported to cause trouble on
        some systems.
        Do you want to probe the I2C/SMBus adapters now? (YES/no): y
        Using driver `i2c-i801' for device 0000:00:1f.3: Intel 3400/5 Series (PCH)

        Now follows a summary of the probes I have just done.
        Just press ENTER to continue:

        Driver `coretemp':
          * Chip `Intel digital thermal sensor' (confidence: 9)

        To load everything that is needed, add this to /etc/modules:
        #----cut here----
        # Chip drivers
        coretemp
        #----cut here----
        If you have some drivers built into your kernel, the list above will
        contain too many modules. Skip the appropriate ones!

        Do you want to add these lines automatically to /etc/modules? (yes/NO)

    My output for 'sensors' is:

        acpitz-virtual-0
        Adapter: Virtual device
        temp1:        +58.0°C  (crit = +100.0°C)

        coretemp-isa-0000
        Adapter: ISA adapter
        Core 0:       +56.0°C  (high = +84.0°C, crit = +100.0°C)
        Core 1:       +57.0°C  (high = +84.0°C, crit = +100.0°C)
        Core 2:       +58.0°C  (high = +84.0°C, crit = +100.0°C)
        Core 3:       +57.0°C  (high = +84.0°C, crit = +100.0°C)

    and my '/etc/modules' is:

        # /etc/modules: kernel modules to load at boot time.
        #
        # This file contains the names of kernel modules that should be loaded
        # at boot time, one per line. Lines beginning with "#" are ignored.
        lp
        rtc

        # Generated by sensors-detect on Fri Nov 30 23:24:31 2012
        # Chip drivers
        coretemp
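    For what it's worth, the proprietary fglrx driver does not expose the GPU sensors through the kernel hwmon/i2c interfaces that lm-sensors reads, which would explain why sensors-detect finds nothing for the Radeon even though aticonfig can see temp and fan. A hedged sketch of reading them through fglrx's own tooling (the pplib fan command is widely cited but not formally documented and may vary by driver version):

        # GPU core temperature via the Overdrive interface
        aticonfig --odgt
        # current fan speed
        aticonfig --pplib-cmd "get fanspeed 0"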


  • Trouble switching to Dvorak on Ubuntu 12.04

    - by Dodgie
    I've decided to switch to Dvorak on my Ubuntu machine, but I'm having some trouble. First, I attempted to do this through the GUI: System Settings - Keyboard Layout - add layout (plus sign) - English (programmer Dvorak). This didn't work at first, so I restarted my machine. It seemed to work at the password prompt (if only because QWERTY did not), but I couldn't get it to accept my password. I used the virtual keyboard option to enter my password with mouse clicks (the virtual keyboard was using programmer Dvorak) and was able to get in that way. Once logged in, however, I was back to QWERTY. Second, I tried to switch on the command prompt:

        $ loadkeys /usr/lib/kbd/keytables/dvorak.map

    The error message I received was "Couldn't get a file descriptor referring to the console". Does anyone know what I'm doing wrong? I've looked for a solution to these problems but couldn't find anything.
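    For what it's worth, that loadkeys error typically means it was run as an unprivileged user from inside X; loadkeys also only affects the virtual consoles, not the X session. A hedged sketch of the two separate switches (the xkb variant name "dvp" for programmer Dvorak is an assumption worth verifying against /usr/share/X11/xkb/rules/evdev.lst):

        # virtual consoles: needs root, and the plain map name is usually enough
        sudo loadkeys dvorak
        # X session: programmer Dvorak via setxkbmap
        setxkbmap -layout us -variant dvp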

