Search Results

Search found 9137 results on 366 pages for 'pc bsd'.

Page 283 of 366

  • Make GNOME 3 work as I expect! [closed]

    - by gnome
    Possible Duplicate: How to revert to GNOME Classic? I don't like Unity, so I tried Ubuntu GNOME Remix 12.10, but I was disappointed by that too:
      - No right-click on the desktop (I know there is a way to enable it).
      - No icons on the desktop.
      - I want "Applications", "Places" and "System" on the top panel as before; with the default GNOME 3 settings it takes many more steps to get where I want to go.
      - For some applications, when the window is maximized there is no close button or "restore to previous size" button. In fact, there is no title bar at all in this case, so I can't drag it to restore the previous size and move the window.
    So I have three questions (for those who liked the way GNOME worked before): 1) What is different between GNOME 2 and GNOME 3 that makes you prefer the GNOME 2 behaviour, and how do you change GNOME 3 to work the GNOME 2 way? 2) I don't like the appearance of GNOME 3; for those who have tried to make it look good, could you share how you customized it? 3) (This one is just for chatting, not a real question.) Why do all the main operating systems (Windows, Mac, some Linux) want to merge desktop and tablet with the focus on tablet, so that even when installed on a desktop the interface still looks like a tablet? Do they think the personal desktop PC will matter less and less?
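
    For reference, a minimal sketch of the usual way to get a GNOME 2-style session and desktop icons back on that era of release; the package names and the settings key are assumptions from the GNOME 3.x period, not a guaranteed fix:

        # install the classic (fallback) session and the tweak tool (package names may differ by release)
        sudo apt-get install gnome-session-fallback gnome-tweak-tool
        # let Nautilus draw icons on the desktop again
        gsettings set org.gnome.desktop.background show-desktop-icons true
        # then log out and pick the "GNOME Classic" session from the login screen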

    Read the article

  • DirectX vs XNA - Which is better for me? [closed]

    - by tristo
    Recently I moved from Visual Studio 2010 to Visual Studio 2012, although I did not expect VS 2012 to be designed the way it is. Anyway, I am pleased with some of the VS 2012 technology and have moved all of my projects to it. Since getting VS 2012 I have been making Windows applications and doing other non-game work, but I have recently gotten back into the spirit of game development and I am planning to make a 3D comical game: shader effects, not too complicated meshes, but it requires a lot of lighting effects to emphasise certain parts of the game. When I was using VS 2010 I had a great time making 2D games with XNA; it uses a great language and has a very nice framework. But I no longer have XNA available, and the workarounds described on Stack Overflow always give me errors when using it. It also seems that Microsoft has left XNA in an awkward position with Windows 8, with it only being available on PC and Xbox. For these reasons I have decided to work with DirectX and Direct3D for my new game, although the overflowing credits after each DirectX game give me the shivers, and the low-level coding DirectX requires also puts me on thin ice, leaving me in a confused mess about what decision to make. I don't know anything about DirectX or Direct3D. I am an indie developer, but I am planning to take on a lot of the professional aspects of games. I don't have heaps of time (2-3 hours a day), and I don't mind the complexity of how DirectX works, as long as I can learn to build the fundamentals of a game in a week. I am also unsure whether DirectX really fits my situation, or whether I should stick with XNA game development. Anyone who can tell me the best technology for me would be great.

    Read the article

  • What are the advantages of using vector-based fonts over bitmap fonts in (2d) games?

    - by jmp97
    I know that many games use bitmap fonts. What are the advantages of vector-based font rendering/manipulation compared to bitmap fonts, and in which scenarios do they matter the most? Please focus on 2D games when answering this question. If relevant, please include examples of games using either approach. Some factors you might consider:
      - amount of text used in the game
      - scaling of text
      - overlaying glyphs and anti-aliasing
      - general rendering quality
      - font colors and styling
      - user interface requirements
      - localisation / unicode
      - text wrapping and formatting
      - cross-platform deployment
      - 2d vs 3d
    Background: I am developing a simple falling-blocks game in 2D, targeted at PC. I would like to add text labels for level, score, and menu buttons. I am using SFML, which uses FreeType internally, so vector-based features are easily available for my project. In my view, font sizes in simple games often don't vary, and bitmap fonts should be easier for cross-platform concerns (font formats and font rendering quality). But I am unsure whether I am missing some important points here, especially since I want to polish the looks of the final game.

    Read the article

  • Ubuntu live CD startup error

    - by Emiel
    First off, I'm new to the Linux scene. This is my first attempt at a single-boot installation of Ubuntu. I tried it for a few days in dual boot with Windows 7 and I was sold, so I removed the tumor my PC had to endure for so long (sorry, laptop) and installed Ubuntu from a USB boot device. My dual boot was as follows: Windows 7 was installed on partition C of hdd1, and the Windows installer for Ubuntu had put Ubuntu on partition I of that same drive. In the live CD installation I chose the normal option for removing Windows, and it said that after the installation my partition would be 320 GB, which is the total size of my hdd, so I assumed it would format the whole drive. Now the installation has completed and it tells me to restart my system, and here comes the problem: after the BIOS loads I get a blinking white cursor on my screen and it won't budge. It just sits there, never loading Ubuntu, and the system gets very hot at this point. Then I tried to reinstall using the same live image, which is still on my USB drive, but when I boot from the USB I get an error: "no such file" with some address, and then a grub rescue prompt. What should I do? I can get hold of a Windows 7 copy, but I don't really want to use that crap again... Thanks for helping me out. Kind regards, Emiel
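
    For reference, a minimal sketch of the usual GRUB repair from a live session in this kind of situation; the device and partition names (/dev/sda, /dev/sda1) are assumptions and would need to be confirmed first with something like sudo fdisk -l:

        # mount the installed Ubuntu root partition (adjust the device name)
        sudo mount /dev/sda1 /mnt
        # reinstall GRUB to the disk's MBR, pointing it at the installed system
        sudo grub-install --boot-directory=/mnt/boot /dev/sda
        # alternatively, the boot-repair tool automates the same steps
        sudo add-apt-repository ppa:yannubuntu/boot-repair
        sudo apt-get update && sudo apt-get install -y boot-repair
        boot-repair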

    Read the article

  • JavaOne 2012 in Review

    - by Janice J. Heiss
    Noted freelance writer Steve Meloan has a new article up on otn/java, titled “JavaOne 2012 Review: Make the Future Java,” in which he summarizes the happenings at JavaOne 2012. Along the way, he reminds us that if the future turns out to be anything like the past, Java will do fine: “The repeated theme for this year's conference was ‘Make the Future Java,’ and according to recent stats, the groundwork is already firmly in place:
      - There are 9 million Java developers worldwide.
      - Three billion devices run Java.
      - Five billion Java Cards are in use.
      - One hundred percent of Blu-ray Disc players ship with Java.
      - Ninety-seven percent of enterprise desktops run Java.
      - Eighty-nine percent of PC desktops run Java.
    This year's content curriculum program was organized under seven technical tracks:
      - Core Java Platform
      - Development Tools and Techniques
      - Emerging Languages on the JVM
      - Enterprise Service Architectures and the Cloud
      - Java EE Web Profile and Platform Technologies
      - Java ME, Java Card, Embedded, and Devices
      - JavaFX and Rich User Experiences”
    Meloan artfully reminds us of how JavaOne makes learning fun. Have a look at the article here.

    Read the article

  • Client/Server game even in solo: any big problem?

    - by Klaim
    I'm making a game whose basic design is strongly built around multiplayer, but which should also provide a really interesting, self-sufficient solo game, a bit like a real-time strategy game. The events and actions taken aren't as massive and immediate as in an FPS, so you can think of the networking as being like that of an RTS. It's a PC game, targeting Windows, Mac OS X and Linux (Ubuntu and Fedora). It's programmed in C++ using a variety of open source libraries, so I have great (potential) control over the performance. So far I had always assumed that making the game run as two applications, client and server, even in solo mode, was fine. However, now that I'm starting the network code, I'm having doubts about whether it's a good idea. I'm not a specialist, so I might be missing something in my analysis. I see these pros and cons:
    Pros:
      - The game works only one way, so if I fix a bug it applies to all game modes, whatever the distance to the server is.
      - Basic networking issues would be detected early, including behaviour with protection software (firewalls) installed (I am not a specialist, so this might be wrong).
    Cons:
      - I suppose that even if it is fast enough, networking the client and server on the same computer would still be slower than no networking at all and passing messages within one process's memory.
      - Maybe debugging would be more difficult? I don't have experience with this case, but so far I assume that Visual Studio lets me debug multiple processes, so it shouldn't be that different. There is also remote debugging.
    My question is: is there a big disadvantage that I missed? Or maybe there are advantages I missed that should encourage me to just continue with client-server-only game sessions?

    Read the article

  • Shader compile log depending on hardware

    - by dreta
    I'm done with the core of my graphics engine and I'm testing it on every platform I can get my hands on. What I noticed is that different drivers return different shader and program compile log content. For example, on my friend's laptop, if you successfully compile a shader the log is simply empty. However, on my PC I get some useful information along with it. So if I compile a vertex shader, I'll get: "Vertex shader was successfully compiled to run on hardware." Which isn't that impressive, but look at what happens when I link a program. On my friend's computer the log is empty, since the program links fine. However, on my own computer I get: "Vertex shader(s) linked, fragment shader(s) linked." Which is awesome, because I'm attaching a geometry shader handle of 0 (I have a geometry shader file with trash in it, so it doesn't compile and the handle is set to 0), and the log just tells me which shaders actually linked. Now it got me thinking: if I were going to buy a graphics card, is there a way for me to find out in advance whether I'll get this "extended" compile information? Maybe it's vendor specific? I don't really expect an answer, to be honest, as this seems a bit obscure, but maybe somebody has experience with this and could share it.

    Read the article

  • How can I disable the purple bootloader splash at boot?

    - by wim
    This question has been answered before, but neither of the methods in the accepted answer worked for me on 11.10. First I tried editing /etc/default/grub and then running sudo update-grub, but after that I still got a blank, plain, purple screen while the kernel is loading. The screen has no boot options, and it obscures the dmesg output that I want to see going by in the terminal. Next I tried removing the plymouth-theme-* packages, but that just broke my gnome-shell theme, and the purple screen still remained. I have also tried configuring it with the startupmanager package, but nothing seems to get rid of that darned purple splash. Here are the contents of my /etc/default/grub file:
        # If you change this file, run 'update-grub' afterwards to update
        # /boot/grub/grub.cfg.
        # For full documentation of the options in this file, see:
        # info -f grub -n 'Simple configuration'
        GRUB_DEFAULT="0"
        GRUB_HIDDEN_TIMEOUT="0"
        GRUB_HIDDEN_TIMEOUT_QUIET="true"
        GRUB_TIMEOUT="0"
        GRUB_DISTRIBUTOR="`lsb_release -i -s 2> /dev/null || echo Debian`"
        #GRUB_CMDLINE_LINUX_DEFAULT=""
        GRUB_CMDLINE_LINUX_DEFAULT=""
        GRUB_CMDLINE_LINUX=""
        # Uncomment to enable BadRAM filtering, modify to suit your needs
        # This works with Linux (no patch required) and with any kernel that obtains
        # the memory map information from GRUB (GNU Mach, kernel of FreeBSD ...)
        #GRUB_BADRAM="0x01234567,0xfefefefe,0x89abcdef,0xefefefef"
        # Uncomment to disable graphical terminal (grub-pc only)
        # GRUB_TERMINAL="console"
        # The resolution used on graphical terminal
        # note that you can use only modes which your graphic card supports via VBE
        # you can see them in real GRUB with the command `vbeinfo'
        #GRUB_GFXMODE="640x480"
        # Uncomment if you don't want GRUB to pass "root=UUID=xxx" parameter to Linux
        #GRUB_DISABLE_LINUX_UUID="true"
        # Uncomment to disable generation of recovery mode menu entries
        #GRUB_DISABLE_RECOVERY="true"
        # Uncomment to get a beep at grub start
        #GRUB_INIT_TUNE="480 440 1"
        GRUB_DISABLE_OS_PROBER="true"
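
    For reference, a minimal sketch (not a confirmed fix) of the GRUB-side settings usually involved in forcing a plain text-mode boot so kernel messages stay visible; the exact values here are assumptions to experiment with:

        # in /etc/default/grub: uncomment GRUB_TERMINAL and hand the kernel a text console
        GRUB_TERMINAL="console"
        GRUB_GFXPAYLOAD_LINUX="text"
        # make sure "quiet splash" is absent from GRUB_CMDLINE_LINUX_DEFAULT,
        # then regenerate the boot configuration
        sudo update-grub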

    Read the article

  • Ubuntu live CD: black screen and blinking cursor

    - by IFasel
    I'm trying to install Ubuntu 12.04 on my computer. I can get to the purple screen of the live CD, but then, if I choose "Install Ubuntu", I get a black screen with a blinking cursor (and nothing else happens). My PC: Acer Aspire M3920, CPU i5-2300, 8 GB RAM, NVIDIA GT 405. What I have already tried:
      - 12.04 and the 13.04 daily build
      - a live USB and a live DVD
      - the boot options nomodeset and acpi=off
    I googled a lot and it seems it could be a graphics card problem. Do you know any other boot options I could try?
    UPDATE: This is not a duplicate: I've tried all the common boot options (nomodeset, noacpi...) and they don't change anything. With the option "no splash" (instead of "quiet splash"), I can see what happens before the forever-blinking cursor:
        [sdg] no caching mode present
        [sdg] assuming drive cache: write through
        ata8.00: exception Emask 0x52 ... frozen
        ata8: SError: { RecovData RecovComm UnrecovData ... }
        ata8.00: failed command: IDENTIFY PACKET DEVICE ...
        ata8.00: status: { DRDY }
        ata8: hard resetting link
    Does somebody know what this means? N.B. astonishingly, Puppy Linux boots fine (but Debian, Fedora and Ubuntu do not).
    Solution: In fact, it was not a graphics card problem. I had to disconnect the DVD drive and connect it to another free SATA connector (I don't really understand why Ubuntu had trouble with this connector and Windows 7 did not). After that, everything worked fine.

    Read the article

  • nVidia 9800 GTX+: X11 fails to initialize, no Unity or LightDM

    - by rlemon
    I have just put 12.04 on my work PC (a fresh install, not an upgrade), installing updates during the install, and after everything loads (with no errors) and I restart, I get dropped to the console 1 tty login. Console 7 looks like this: [screenshot]. IIRC I did not have to finagle with my drivers on 11.10 to get this card working. If this is in fact a driver bug I will remove this post and submit the bug, but I'm not 100% confident that it is. I attempted to run unity --reset and got this: [screenshot]. Lastly I tried sudo apt-get install nvidia-current, which tells me nvidia-current is already the newest version, so I ran sudo dpkg-reconfigure nvidia-current, which says /usr/sbin/dpkg-reconfigure: nvidia-current is broken or not fully installed. Anything I can try from here would be awesome. Currently the only way to get the system up and running was to shut down, plug one of my monitors into the onboard video, enable the onboard video card in the BIOS, and boot back up (and on my single monitor everything is fine).
    UPDATE: I have been able to boot fresh with the external card plugged in as long as I don't take the updates during the install. Past that, if I install just the nvidia drivers (nvidia-current or nvidia-current-updates) from the main (or Canadian) server, I then get the problems. My proposal, which I don't know how to go about: can I try installing the previous version of this driver? In the past, on another machine, I had issues with my NIC driver being funky; I downgraded to the previous driver and bam, everything was merry and well.
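
    For reference, a minimal sketch of how one might clean out and pin the NVIDIA packages from that tty, in case a clean reinstall or downgrade is worth trying; the version string is a hypothetical placeholder and would need to come from the apt-cache output:

        # see which driver versions the archive offers
        apt-cache policy nvidia-current
        # remove the half-configured driver and its settings
        sudo apt-get purge nvidia-current nvidia-current-updates
        # reinstall, optionally pinning an older version reported above
        sudo apt-get install nvidia-current
        # sudo apt-get install nvidia-current=<older-version>   # hypothetical version pin
        sudo reboot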

    Read the article

  • ATI HD5450 w/ Ubuntu 14?

    - by Oliwb
    So, I'm running Ubuntu 12.04 right now. Last night I realised that I'm way behind, as we're up to 14! I decided to run the updater and figured I'd take the path of least resistance (but lengthy choice) of going 12.04 - 12.10 - 13.xx - 14.xx. So I downloaded the first packet and then got an error message saying my graphics card might not work in 12.10. Now, part of the reason I was looking to upgrade is that I get (and have always had) a strange occasional flickering; now that I have two screens it's just on the second monitor (oddly, this is not the same port that was giving issues before). The graphics card is an ATI Radeon HD5450 and I have the Catalyst driver (I think it's 13 or 14) installed, as of last night. It could be that the graphics card has never worked properly; I bought the PC new with an "upgraded" video card and it has always suffered from this flicker. I just figured the drivers weren't right or something. So I have three questions:
      1) Is my video card broken, or is the driver letting it down and causing the flicker?
      2) Will it be able to handle the upgrade to 14 via 13? Or should I cut my losses and get something newer?
      3) If I should get something newer, what should I get?
    Thanks in advance....

    Read the article

  • Strange traffic on fresh Ubuntu Server install

    - by Fishy
    I've just installed Ubuntu Server on my home box after becoming partially familiar with it at work and wanting to train up as a pen tester. I installed the latest version on a logical partition (the main one contains Win7), and selected none of the extra modules (I think). I installed ngrep and fired it up (along with tcpdump) and immediately saw some strange traffic which I am unable to identify. My PC is sending out UDP packets every couple of seconds to a seemingly random series of IP addresses, all on the same port (47669), though I did also see it use another port for a while. I watched it do this for about 20 minutes while trying to work out why. The only other traffic was the odd ARP request for the router and SSDP UPnP broadcasts from the router. Does anyone know what this is, or have any advice on how best to find out? Thanks.
    EDIT: Actually, it's not my box generating the traffic. It's receiving the traffic on that port from a series of IP addresses, and returning "port unreachable" messages.
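
    For reference, a minimal sketch of how one might narrow this down; the interface name eth0 is an assumption and would need to match the actual NIC:

        # watch only the suspicious UDP port, without name resolution
        sudo tcpdump -n -i eth0 udp port 47669
        # check whether any local process is actually bound to a UDP socket
        sudo netstat -lnup
        sudo lsof -i UDP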

    Read the article

  • How To Sync Your Shared Google Calendars with Your iPhone

    - by Justin Garrison
    Smartphones are essential to our daily lives. They help us stay connected and keep us organized, but when it comes to calendar syncing and Gmail there are limitations. Here's how you can sync your shared calendars and contacts from Gmail. If you use Gmail you probably know about the ability to create and share calendars with others. They help keep groups organized and even let you subscribe to public events. When it comes to getting that information onto your smartphone, there are some trade-offs if you are on a non-Android phone. Android phones will sync your email, contacts, and all of your calendars just by signing into your Gmail account. If you have an iPhone, however, you will miss out on contact syncing if you set up your account as a Gmail account.

    Read the article

  • Samba share external USB device

    - by bioShark
    I managed to share folders from my Ubuntu 12.04 machine on my private network, and the data is visible from a Windows machine. I even shared an hdd that has Windows on it, so everything seems to work fine. However, when I want to share a mounted device (USB pen drive, USB HDD, etc.), the Windows machine reports: "Access denied on file \...". I realize this is due to the missing rights on the mounted folder: by default a mounted folder gets the equivalent of 700 (drwx------), with myself as owner. But I can't seem to change the rights on the external device; they remain 700. Is there a special trick needed to share NTFS-mounted USB devices? Thanks. P.S. From this question I see that NTFS devices cannot be shared... is this true? It seems a bit strange, because I have two HDDs with three NTFS partitions in my PC, and I can share those without a problem.
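
    For reference, a minimal sketch of the usual workaround: NTFS has no Unix permission bits to chmod, so the effective rights are set at mount time via ntfs-3g options. The device node and mount point below are assumptions and would need adjusting:

        # remount the USB drive so every user (and thus the Samba service) can read and write it
        sudo umount /media/usbdrive
        sudo mount -t ntfs-3g -o uid=$(id -u),gid=$(id -g),umask=000 /dev/sdc1 /media/usbdrive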

    Read the article

  • Enabling GTX 570

    - by Silas
    Hello, I just built my new system:
      - ASRock Z77 Extreme4
      - Intel i7 3770K
      - 16 GB Corsair RAM
      - Zotac NVIDIA GTX 570
      - be quiet! 630 W power supply
      - 120 GB SSD
    I installed Ubuntu 12.04 64-bit and it ran smoothly. I downloaded and installed all the recommended updates. After checking the system details, the GTX 570 didn't show up as the graphics unit, so I figured I needed to download the drivers. I did, but being a complete newbie to Linux I didn't succeed in installing them (I think). Anyway, after several tries and errors I shut down the PC and restarted it, resulting in no signal to my screen. After trying to reboot and trying all the monitor outputs with no result, I took out the graphics card; now it boots normally, but after booting it says there is a problem with the system and the graphics can't be recognized. So:
      Question A: What do I do? I like Linux, but the arbitrariness of errors that occur without any changes to the system scares me.
      Question B: Is there a beginners' guide to Ubuntu where I could start from scratch? Because I really want this to work.
      Question C: Now that the system is (suddenly) showing these graphics errors, so far without visible consequence despite the error message, should I reinstall the GPU and give the driver installation another try, or the other way around?
    I'll be very grateful for any help. Thank you in advance!
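
    For reference, a minimal sketch of the usual first steps with the card reinstalled, run from a terminal; this is not presented as a guaranteed fix, and it assumes using the packaged driver rather than a manually downloaded installer:

        # confirm the card is detected at all
        lspci | grep -i vga
        # install the packaged NVIDIA driver instead of the .run installer from the website
        sudo apt-get update
        sudo apt-get install nvidia-current
        sudo reboot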

    Read the article

  • Oracle Virtual Desktop Infrastructure

    - by Fat Bloke
    A lot of the recent blog entries here have been about Oracle VM VirtualBox, possibly the coolest personal desktop virtualization product known to man. Deploying VirtualBox on your PC or Mac lets you run many virtual desktops at the same time for one user: you. But did you know that VirtualBox can also power an Enterprise-scale virtual desktop deployment too, delivering many desktops to many users? As part of another Oracle product, Oracle Virtual Desktop Infrastructure (VDI), VirtualBox can run your Windows, Linux or Solaris desktops on servers located in the datacenter. Oracle VDI orchestrates the whole deal by looking after:
      - creating or cloning the virtual desktops from a master template;
      - managing the lifecycle of the desktops (create, start, suspend, resume, stop, delete);
      - assigning which users get which desktops;
      - delivering easy and fast access to these virtual desktops from almost any device, such as existing PCs or Macs, iPads, or specially designed Sun Ray client devices;
      - load balancing and session management of all of this.
    Architecturally the solution looks something like this: [architecture diagram]. This is an increasingly hot area of the IT landscape, so the Fat Bloke has decided to create a new blog category (VDI) and dedicate a few blog entries to looking into this in a bit more detail over the next few weeks. Watch this space... - FB

    Read the article

  • Re-installing Ubuntu without losing files, how to?

    - by moraleida
    Some time back I bought a second PC to serve as my backup machine, but I've never managed to set it up the way I would like. Now I want to start over, but I've messed so much with its disks that I'm kind of afraid of losing something along the way, hence this question. Right now I have a 1 TB disk partitioned like this (as per GParted):
      - /dev/sda1 (ext4), 346.12 GB: almost full, has an old install of Ubuntu 11.10. It no longer boots, ever since I installed Windows 7 on sda3. Everything that matters to me is tucked into /var/www/; all the rest can go.
      - /dev/sda2 (ext4), 196.45 GB: has an old install of 12.04 and nothing important; it's pretty much empty and also doesn't boot.
      - /dev/sda3 (ntfs), 377.97 GB: my boot partition with Windows 7 and some important files; I'd like to keep it untouched.
      - /dev/sda4 (extended), 10.97 GB: was created when I first installed Ubuntu, I think.
    In my ideal world, I'd like to safely reinstall Ubuntu from the 12.04 live USB and merge sda1 and sda2 without losing any files. Is that possible? How?
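
    For reference, a minimal sketch of the safety net usually taken first: copying the data worth keeping off sda1 from a live session before any repartitioning. The mount point and backup destination here are assumptions:

        # from the 12.04 live USB, mount the old root partition read-only
        sudo mkdir -p /mnt/old
        sudo mount -o ro /dev/sda1 /mnt/old
        # copy the web root somewhere safe (external drive, the NTFS partition, etc.)
        rsync -a /mnt/old/var/www/ /media/backup/www/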

    Read the article

  • Error on /home prevents OS boot

    - by mdrg
    OK, I'll try to explain by listing the events:
      - The motherboard battery is weak and loses its config every time I shut the PC down; the system clock usually goes back to 1990-something, but this time it jumped forward to 2148.
      - Ubuntu became somewhat unusable (Firefox refusing certificates, Dropbox blocked), so I restarted.
      - I manually set the clock to the current date and restarted.
      - During boot, Ubuntu detected errors and started an auto-fix, rebooting itself in the middle of it (like a hard reboot, no shutdown messages).
      - The next time, it showed the boot animation screen in text mode, with an error message after a while: "Errors were found while checking the disk drive for /home", offering to "attempt to fix, ignore, skip mounting or manual recovery". None of the options during boot work; they just echo the text to the screen, and then I'm forced to physically restart.
    Now I'm using a Mint LXDE live CD, and I can see all my partitions just fine, including /home (my home folder is blocked due to encryption, but the other user's folder is accessible and everything is in place). I'm not sure how to proceed. I'd like to just fix the Ubuntu boot without reinstalling or anything like that (at least until 12.04). What should I do now? Can I easily fix this somehow? Thanks!
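
    For reference, a minimal sketch of the manual check usually attempted from the live session; the partition carrying /home is an assumption here and must stay unmounted while it is checked:

        # identify the partition that carries /home
        sudo fdisk -l
        # check and repair it while it is NOT mounted (adjust the device name)
        sudo fsck -f /dev/sda3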

    Read the article

  • Purpose-oriented user accounts on a single desktop?

    - by dd_dent
    Starting point: I currently do development for Dynamics AX and Android, with an occasional dabble in Wordpress and Python. Soon I'll start a project involving setting up WP on Google App Engine. Everything is, and should continue to, run from the same PC (running Linux Mint).
    Issue: I'm afraid of botching/bogging down my setup by tinkering with and installing multiple runtimes, IDEs, SDKs and services, so I was thinking of using multiple users, each dedicated to the task at hand (web, Android, etc.), and making each user as inert as possible to the others. What I need to know is the following:
      - Is this a good/feasible practice? The second-closest thing to it is using remote desktop connections, either to other computers or to VMs, which I'd rather avoid.
      - What about switching users? Can it be made seamless?
      - Anything else I should know?
    Update and clarification regarding VMs and whatnot: the reason I wish to avoid resorting to VMs is that I dislike the performance impact and sluggishness associated with them. I also suspect they might add a layer of complexity I wish to avoid. The answer by Wyatt is interesting, but I think it's only partly suited to my requirements (web development, for example). Also, in reference to the point made about system-wide installs, there is a level of compromise I should accept, as expressed by this, for example. The option suggested by 9000 is also enticing (more than VMs, actually), and by no means do I intend to "juggle" JVMs and whatnot, partly for the reason mentioned before. Regarding complexity, I agree and will consider what was said; only, from my experience, I tend to pollute my work environment with SDKs and runtimes I tried and discarded, which occasionally leave leftovers that cause issues throughout the session. What I really want is a set of well-defined, non-virtualized sessions from which I can choose at my leisure and be mostly (to a reasonable extent) safe from one session affecting another. And what I'm really asking is if and how this can be done using user accounts.
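
    For reference, a minimal sketch of the account-per-purpose idea on a Mint/Ubuntu-style system; the user names are hypothetical and this is offered as an illustration, not an endorsement of the approach:

        # create a dedicated account for each kind of work
        sudo adduser androiddev
        sudo adduser webdev
        # switch into one of them for a throwaway shell session
        sudo -i -u androiddev
        # or use the desktop's "switch user" entry for a full,
        # separate graphical session under that account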

    Read the article

  • I need help with Grub and restoring Windows?

    - by Bob Tahog
    I started out with Windows XP, then I installed Zorin (a distro based on Ubuntu), and then I installed Ubuntu. This was working great. Then I installed Windows 8 on yet another partition and couldn't get into my other OSes. I asked my tech teacher at school how to fix it and she said to just clear the partition I had installed Windows 8 on, so I booted a live version of Ubuntu and cleared the Windows 8 partition. Then I rebooted and it still went into Windows 8 for some reason. So I got back into the live Ubuntu, and it turned out the Windows 8 partition hadn't cleared for some reason, so I did it again (and I'm positive it was the Windows 8 partition). I still couldn't fix GRUB, but I needed something from my XP partition, so I mounted it in the live Ubuntu, and now all the XP partition contains are the folders 'Boot', 'Recovery', 'System Volume Information' and 'temp', and the files 'bootmgr', 'BOOTNXT', 'BOOTSECT.BAK' and 'Recovery.txt'. Does anybody know how to fix this, or what I did wrong? Also, if I try booting from my hard drive, it shows the Windows logo and says 'preparing automatic repair', then 'Diagnosing your PC', then restarts. Any and all help is greatly appreciated.

    Read the article

  • Blank screen during boot after clean Ubuntu 11.10 install (Intel N10 graphics)

    - by Coen
    After a clean install of Ubuntu 11.10 on my Asus Eee PC 1005P, Ubuntu seems to boot correctly, except for initialization of the LCD screen. What I observe:
      - I choose Ubuntu 11.10 in the GRUB 2 menu.
      - A blank screen with a blinking cursor in the top left of the screen, for 15-20 seconds.
      - The Ubuntu logo with 5 red dots in the center of the screen, for 1 second.
      - The LCD screen goes entirely blank.
      - The startup sound plays (Ubuntu is configured to auto-login).
      - Still, the LCD screen is entirely blank.
    When I press Fn-F8 (the switch between the LCD screen and external VGA), the LCD screen shows my desktop correctly and everything seems to work fine. Except for the adjust-contrast buttons (Fn-F5 and Fn-F6): these seem to cycle through random brightness modes, something like 0% - 50% - 20% - 0% - 20% - 0%. Any ideas what's causing this or how to solve it?
        coen@elpicu:~$ lspci -v
        00:02.0 VGA compatible controller: Intel Corporation N10 Family Integrated Graphics Controller (prog-if 00 [VGA controller])
            Subsystem: ASUSTeK Computer Inc. Device 83ac
            Flags: bus master, fast devsel, latency 0, IRQ 44
            Memory at f7e00000 (32-bit, non-prefetchable) [size=512K]
            I/O ports at dc00 [size=8]
            Memory at d0000000 (32-bit, prefetchable) [size=256M]
            Memory at f7d00000 (32-bit, non-prefetchable) [size=1M]
            Expansion ROM at <unassigned> [disabled]
            Capabilities: <access denied>
            Kernel driver in use: i915
            Kernel modules: i915
        00:02.1 Display controller: Intel Corporation N10 Family Integrated Graphics Controller
            Subsystem: ASUSTeK Computer Inc. Device 83ac
            Flags: bus master, fast devsel, latency 0
            Memory at f7e80000 (32-bit, non-prefetchable) [size=512K]
            Capabilities: <access denied>

    Read the article

  • Reliance on Outlook (been a looong time, I know)

    - by AndyScott
    Do you feel that your development group is too reliant on Outlook? Have you reached the point where you have to search your email for pertinent information when asked? What are you using? I realized things had gotten out of hand a couple of weeks ago over a weekend. I was at my in-laws' house (in the country; no PC/laptop, no internet connection) and I got an email on my phone that I needed to reply to, but I couldn't send without deleting items from my inbox/sent items/etc. Now mind you, I have rules set up to move things into folders, and items more than a month old are automatically moved to the PST; but generally I don't manually move items to a PST until I have had a chance to 'work' the item. Please don't bother mocking my process, it's just the way I work. That being said, it was a frustrating exercise in 'I need all this information, what can I afford to lose?'. I work on an international project (think lots of customers), and conversations in 9 or 10 different directions about 10-20 different things are not abnormal for a given day. I have found myself looking data up in Outlook because that's where it is. I think I have reached the point where I don't feel Outlook is up to the task of organizing the data it contains. When you have that many emails (200 or so a day), information seems to get lost at times, and I find that Outlook's search capabilities are lacking. Additionally, I find that any sort of organizational 'system' for sorting emails that can cover multiple topics is a lost cause. But at the same time, the old process of taking the information from emails and moving it into another 'notes' type of program has proved too time consuming. Does anyone out there have a better type of system? (Comments about the capacity of my brain and its ability to recall information are not needed.)

    Read the article

  • My home partition slowly fills up until the system is unable to complete even simple tasks

    - by user973810
    So I have an awesome configuration on my home PC: my /home directory has its own partition. But the home partition slowly fills up until the system is unable to complete even simple tasks. For example, when this issue has occurred, I try to load Firefox and it just pops up an error message saying that it cannot be done. Rebooting solves the issue. The strange thing is, I've run baobab and it doesn't notice a problem: there should be hundreds of gigabytes of data somewhere, but it doesn't see it. Does anyone have any idea how I might troubleshoot this? I'm thinking I could run lsof, but I've always found its output to be too much information. Maybe my drive is, like, dying.
    Edit: is there a /home analogue of /var that gets cleared out at boot time? Maybe I could check in there next time I notice this problem to see if I can divine what's up.
    Update: I found the issue. My .xsession-errors file is filling up with "Authentication deferred - ignoring client message". Is there a way I can see what is causing this and fix it?
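
    For reference, a minimal sketch of how one might catch this while it is happening, since space held by deleted-but-still-open files is invisible to tools like baobab; these are generic troubleshooting commands, not a diagnosis:

        # compare reported usage with what is actually on disk
        df -h /home
        sudo du -xsh /home/*
        # list deleted files that some process still holds open (their space is not freed)
        sudo lsof +L1
        # watch the runaway log grow in real time
        tail -f ~/.xsession-errors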

    Read the article

  • What to do about "system running in low-graphics mode"?

    - by ubuntubabe
    My Dell, which was 5 years old, suddenly karked it and I got the "low graphics" black screen and its useless dialogue box. As I believed the graphics card was dead, I went out and bought a brand new machine. I put the new machine aside and tried again, in vain, to get the Dell going. I eventually got to the command line via Ctrl+Alt+F1, logged into my account from there, and simply started a series of sudo apt-get remove commands for various software that I knew was installed on my PC (software without any great consequence, like Google Earth, tweak, Skype, etc.). Lo and behold, after a sudo reboot my computer was fine again! So now I have two computers. BUT one week after buying the other one and installing 12.04 (because I love Ubuntu), the SAME PROBLEM arrived! I once again deleted Google Earth and Skype, did a sudo reboot, and everything worked as before. I think there is a bug or something in 12.04, as this problem has never arisen with any other version of Ubuntu.
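
    For reference, a minimal sketch of the steps commonly tried from that Ctrl+Alt+F1 console when the "low-graphics mode" screen appears; these assume a broken X configuration or driver and are offered as a troubleshooting sketch, not a confirmed cause:

        # move any custom X configuration out of the way (if the file exists)
        sudo mv /etc/X11/xorg.conf /etc/X11/xorg.conf.broken
        # make sure the X stack itself is intact
        sudo apt-get install --reinstall xserver-xorg-core
        # restart the display manager
        sudo service lightdm restart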

    Read the article

  • Lubuntu 12.10: Icon Display Problems

    - by SlcBullseye
    First off, I would like to let you know that I am new to Linux. I finally decided to give it a try, and my first project was installing Lubuntu on an old PC to use as a media server for my PS3. One thing I noticed right away is that my icons for applications, files, folders, etc. are not visible. If I move my mouse over the name of an application or file (it only works if I move the mouse up from the bottom of the name), the icon will appear, but if I move the mouse anywhere else over the file, the icon disappears again. Also, if I open a folder, sometimes an icon or two will be displayed but the rest won't, and as before, moving the mouse over the application or file makes the icon disappear. Is there any way to fix this? Is this normal, or could it be a problem with my hardware? I never had this issue when the machine was running its previous OS (Windows XP). Any help would be greatly appreciated. Remember, I am new to Linux, so a thorough explanation would be helpful. Also, if there are any tips, tricks, references, or recommendations to help me jump in and become more familiar with Linux, that would be great! I'm very interested in taking advantage of the power Linux has. Currently I am studying computer programming, so maybe one day I will be able to develop my own Linux distribution.

    Read the article
