Search Results

Search found 16131 results on 646 pages for 'splash screen'.


  • Causes of hard crashes on Windows XP and how to debug

    - by Sam Brightman
    I am occasionally seeing hard lockups on XP: totally unresponsive to keyboard/mouse, the screen freezes at the moment of the crash, and no SSH/VNC is possible. It's very intermittent and there is nothing in the logs. I never see a blue screen or any kind of error message. This morning I logged in via VNC, logged out again, and 20 minutes later sat down physically at the PC; it had crashed around the time of the VNC logout. I tend to suspect video cards in this kind of situation, but it's a modern-ish card with modern drivers (one revision back, but this has been happening for 5 revisions or more), and I would normally expect to at least see a blue screen. What would you suspect? Where can I look, or what can I set up, for more information? Bear in mind that this happens about once every 3 or 4 weeks, so extensive logging or intrusive monitoring isn't really an option.
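
    For intermittent hard lockups with nothing in the logs, two low-overhead things can be armed ahead of time: a kernel memory dump (so a hang that is really a hidden crash leaves something for WinDbg), and the keyboard crash trigger (so a wedged-but-still-alive kernel can be forced to write a dump). A hedged sketch of the registry switches involved - these are the standard XP locations, but verify against your own setup before relying on them:

        reg add "HKLM\SYSTEM\CurrentControlSet\Control\CrashControl" /v CrashDumpEnabled /t REG_DWORD /d 2 /f
        reg add "HKLM\SYSTEM\CurrentControlSet\Services\i8042prt\Parameters" /v CrashOnCtrlScroll /t REG_DWORD /d 1 /f

    The first selects a kernel memory dump (written to %SystemRoot%\MEMORY.DMP on the next bugcheck); the second, after a reboot and only on a PS/2 keyboard, makes holding right Ctrl and pressing Scroll Lock twice force a bugcheck, so even a "silent" hang can be turned into an analyzable dump.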

    Read the article

  • Architecture advice for converting biz app from old school to new school?

    - by Aaron Anodide
    I've got a WinForms business application that has evolved over the past few years. It's forms-over-data with a number of custom UI experiences tailored to the business, so I don't think it's a candidate to port to something like SharePoint or rewrite in LightSwitch (at least not without significant investment). When I started it in 2009 I was new to this type of development (coming from more low-level programming, and my RDBMS knowledge was only slightly greater than what I got from school). Thus, when I was confronted with a business model that operates on a strict monthly accounting cycle, I made the unfortunate decision to create a separate database for each accounting period. Also, when I started I knew DataSets, then I learned Linq2Sql, then I learned Entity Framework; the screens are a mix and match of those. Now, after a few years of developing this thing by myself, I've finally got a small team. Ultimately, I want a web front end (for remote access to more straightforward screens with grids of data) and a thick client (for the highly customized interfaces). My question is: can you offer me some broad-strokes architecture advice that will help me formulate a battle plan to convert over to a single database and lay the foundations for my future goals at the same time? Here's a screenshot showing how an older screen uses DataSets and a newer screen uses EF (I'm thinking this might make it more real for someone reading the question; I'm willing to add any amount of detail if someone is willing to help).

    Read the article

  • Ubuntu Nvidia Xorg Twinview doesn't like my monitors

    - by Andrew Bolster
    Basically, using the latest available Ubuntu drivers (195.36.15), I cannot for the life of me get my two monitors to operate at suitable resolutions. When not using the drivers at all and going single-screen, both monitors support 1680x1050, but this option is only shown for one monitor in nvidia-settings, and when I manually add a metamode to xorg.conf it just gives up initialising the second screen.
        (**) Mar 25 15:49:47 NVIDIA(0): TwinView enabled
        (II) Mar 25 15:49:47 NVIDIA(0): Assigned Display Devices: CRT-0, CRT-1
        (II) Mar 25 15:49:47 NVIDIA(0): Validated modes:
        (II) Mar 25 15:49:47 NVIDIA(0): "1680x1050,1680x1050"
        (II) Mar 25 15:49:47 NVIDIA(0): Virtual screen size determined to be 1680 x 1050
    Any ideas?
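
    For reference, the NVIDIA driver expects a TwinView layout to be declared as a single MetaModes option on the Screen section (one mode per display device, with an explicit offset), rather than per-monitor Modes lines. A hedged sketch only - the Identifier, Device and Monitor names below are placeholders, not taken from the question:

        Section "Screen"
            Identifier   "Screen0"
            Device       "Device0"        # placeholder identifiers
            Monitor      "Monitor0"
            DefaultDepth 24
            Option       "TwinView" "True"
            Option       "MetaModes" "CRT-0: 1680x1050 +0+0, CRT-1: 1680x1050 +1680+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

    If nvidia-settings still only offers 1680x1050 on one output, the log usually says why the mode was rejected on the other (often an EDID or bandwidth complaint just above the "Validated modes" lines).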

    Read the article

  • xorg, nvidia, log-in all hosed - how can I completely reset graphics set-up/settings?

    - by Fred Hamilton
    I just did a fresh install of Mythbuntu 12.04.1 on my Intel MB with an nVidia 9500GT graphics card. The hardware has been working great with 10.10 for about 2 years. Background (optional - feel free to skip to the question): I was trying to get my component video output to generate 720p, messing around with the nvidia drivers, and now the entire display system is hosed. I can SSH in and get a terminal. Depending on which nvidia package I install or remove, I get:
    - Garbage on screen (after I "apt-get remove nvidia*").
    - A low-res graphical log-in screen where I can log in as fred or guest. If I log in as fred, it displays some text-mode status line then goes right back to the log-in screen. If I log in as guest, I actually get the full Ubuntu desktop, but I need to be able to log in as fred.
    - Other times I get an error: "API mismatch: the NVIDIA kernel module has version 304.43, but this NVIDIA driver component has version 295.49."
    I've googled around, including trying this thread with the same error message, but to no effect. Question: How can I just reset X settings, drivers, and everything display-related to the exact same state as after a fresh install?
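
    The version-mismatch error (kernel module 304.43 vs. driver component 295.49) usually means two driver installations are mixed - typically NVIDIA's .run installer on top of the packaged driver. A hedged sketch of a reset, assuming the driver came from the Ubuntu packages; if the .run installer was ever used, run its own uninstaller first (sudo nvidia-installer --uninstall), or the mismatch will persist:

        sudo apt-get purge 'nvidia*'          # remove every packaged NVIDIA component
        sudo rm /etc/X11/xorg.conf            # let X auto-detect (back the file up first if unsure)
        sudo apt-get install --reinstall xserver-xorg-core libgl1-mesa-glx libgl1-mesa-dri
        sudo reboot

    This should bring X back up on the open-source nouveau driver, from which the packaged proprietary driver can be reinstalled cleanly.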

    Read the article

  • External monitor is blank if I boot with the monitor plugged in

    - by Ronald
    Ubuntu 12.04 has a problem with the Intel GM45 chipset featuring the Mobile Intel® Graphics Media Accelerator (GMA) 4500MHD. I have a Compaq Presario CQ70 laptop with this chipset, and I was using the HDMI video port to drive either a projector or a second monitor. Everything was working fine under Ubuntu 8.04, 9.04, 10.04 and 11.04; however, when I upgraded to 12.04 the second monitor stopped working. What I mean by "stopped working" is:
    - Boot with the monitor plugged in: blank screen.
    - Power off, unplug the monitor and power on: everything works.
    - Plug in the monitor: the only mode that works is mirror mode, with two monitors that look the same.
    - Close the laptop lid: the screen goes blank, and the only way back to a usable system is to power off and unplug the monitor.
    If I set the monitor to the maximum resolution it can handle and turn off mirror mode, nothing can be moved onto that screen. This all worked fine with earlier versions of Ubuntu. Are there notes about the changes to the graphics management system in 12.04, like there are for the resolver change?
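
    For comparison, an extended (non-mirrored) layout on the Intel driver is normally set up with xrandr. A hedged sketch only - the output names LVDS1 and HDMI1 are typical values for this hardware, not confirmed, and should be checked with a bare xrandr call first:

        xrandr                                        # list output names and the modes each reports
        xrandr --output HDMI1 --auto --right-of LVDS1 # enable the external output at its preferred mode

    If the external output shows no modes at all while plugged in at boot, that points at the detection problem itself rather than at the layout settings.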

    Read the article

  • TightVNC (or any VNC) viewer windows scaling

    - by mr.b
    Hi, I am currently using TightVNC to connect to multiple remote hosts on the LAN. I start 16 VNC viewers, set "Scaling by: Auto" (in the connection options display), and then select all viewer windows and use Tile Horizontally, which covers my entire screen with VNC viewers. It all works reasonably well, except that desktop interaction is really slow when there are more than 4 VNC viewers open. My question is: does any VNC client (not just TightVNC, but any compatible client) support some kind of smart scaling option, so that the client tells the server something along the lines of "Okay, I'm displaying your entire screen in a window sized 300x225 px, so can you please start sending images encoded at that resolution?", at which point the interactivity of the open connections would dramatically increase, and when I decide to go full screen on some connection, the client and server re-negotiate and the server starts sending full-resolution images again? Thanks!!

    Read the article

  • Startup/Shutdown time in Xubuntu is increasing!

    - by Ankit
    I am a novice Xubuntu user on a dual-boot machine; the other OS I have is Windows 7. When I first began using Xubuntu, I had really fast startup and shutdown (much, much faster than Windows 7 :) ). However, as I started using it more and more for my work, these times started rising. I do not have any problems with the execution speed of running applications; my main concern is the shutdown time, which has now gone above the Windows shutdown time (startup time has only partially increased compared to shutdown). I checked some similar questions like this one; however, they don't seem to answer my concern, because the users there experience a long wait before the screen goes blue. In my case the screen goes blue pretty fast (the desktop session ends and a blue screen with a moving slider appears), but it then remains blue for a long time. Another answer that I saw on Google was to use dmesg and then stop some services that I do not want; however, being a novice, I could not completely understand what that meant.

    Read the article

  • Could not apply the stored configuration for the monitor

    - by dellphi
    I'm using an Nvidia 7300 GT and an Acer V173w monitor, on 64-bit Ubuntu 10.04. Compiz and Emerald work well, but on entering the GUI I always receive the message: "Could not apply the stored configuration for the monitor, could not find a suitable configuration of screens". Why do I always receive it, and what is wrong with the monitor configuration or the PCI-E card being used?
        root@dellph1-desktop:/# xrandr
        Screen 0: minimum 320 x 240, current 1440 x 900, maximum 1440 x 900
        default connected 1440x900+0+0 0mm x 0mm
           1440x900    50.0*
           1024x768    51.0  58.0  59.0
           1360x768    52.0  53.0
           1152x864    54.0  55.0  56.0  57.0
           960x600     60.0
           960x540     61.0
           896x672     62.0
           840x525     63.0  64.0  65.0  66.0
           832x624     67.0
           800x600     68.0  69.0  70.0  71.0  72.0  73.0
           800x512     74.0
           720x450     75.0
           680x384     76.0  77.0
           640x512     78.0  79.0
           640x480     80.0  81.0  82.0  83.0
           576x432     84.0  85.0  86.0  87.0
           512x384     88.0  89.0  90.0
           416x312     91.0
           400x300     92.0  93.0  94.0  95.0
           320x240     96.0  97.0  98.0
        root@dellph1-desktop:/#
    And my xorg.conf:
        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 260.19.29 ([email protected]) Wed Dec 8 12:27:27 PST 2010
        Section "ServerLayout"
            Identifier     "Layout0"
            Screen      0  "Screen0" 0 0
            InputDevice    "Keyboard0" "CoreKeyboard"
            InputDevice    "Mouse0" "CorePointer"
            Option         "Xinerama" "0"
        EndSection
        Section "Files"
        EndSection
        Section "InputDevice"
            # generated from default
            Identifier     "Mouse0"
            Driver         "mouse"
            Option         "Protocol" "auto"
            Option         "Device" "/dev/psaux"
            Option         "Emulate3Buttons" "no"
            Option         "ZAxisMapping" "4 5"
        EndSection
        Section "InputDevice"
            # generated from default
            Identifier     "Keyboard0"
            Driver         "kbd"
        EndSection
        Section "Monitor"
            # HorizSync source: xconfig, VertRefresh source: xconfig
            Identifier     "Monitor0"
            VendorName     "Unknown"
            ModelName      "Acer V173W"
            HorizSync       30.0 - 83.0
            VertRefresh     55.0 - 75.0
            Option         "DPMS"
        EndSection
        Section "Device"
            Identifier     "Device0"
            Driver         "nvidia"
            VendorName     "NVIDIA Corporation"
            BoardName      "GeForce 7300 GT"
        EndSection
        Section "Screen"
            # Removed Option "metamodes" " 1440x900_60 +0+0; 1280x1024 +0+0"
            # Removed Option "metamodes" "1440x900 +0+0"
            Identifier     "Screen0"
            Device         "Device0"
            Monitor        "Monitor0"
            DefaultDepth    24
            Option         "TwinView" "0"
            Option         "TwinViewXineramaInfoOrder" "CRT-0"
            Option         "metamodes" "1440x900_75 +0+0; 1440x900 +0+0"
            SubSection "Display"
                Depth       24
            EndSubSection
        EndSection
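
    That particular message usually comes from the GNOME display-settings daemon trying to re-apply a previously saved monitor layout that no longer matches what the driver currently reports. A hedged guess at the usual remedy - the path below is the standard location under GNOME 2.x / Ubuntu 10.04, so move the file aside rather than deleting it outright:

        mv ~/.config/monitors.xml ~/.config/monitors.xml.bak   # then log out and back in

    GNOME writes a fresh monitors.xml from the current settings; if the message comes back, the stored layout is being re-created from the same mismatch and the xorg.conf metamodes above are the next thing to reconcile.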

    Read the article

  • Macbook Pro Multiple Monitor Problem

    - by thinksocrates
    I have been using a MacBook Pro (newest model) for about 4 weeks with dual monitors. It has been working great using the Mac adaptor to DVI. Today, however, my Mac will not recognize its built-in monitor while the second monitor is plugged in:
    - Plug in the second monitor: the screen on the laptop goes dark and the second monitor acts as the main screen.
    - Click "Detect Displays": nothing happens.
    - Unplug the second monitor: the screen on the laptop comes back on.
    Any thoughts?

    Read the article

  • Ubuntu installation on iMac

    - by Shanew
    I have an iMac configured as follows: 27" screen, 3.4 GHz i7 CPU, AMD Radeon HD 6970 graphics with 1024 MB. I downloaded the Ubuntu 11.10 64-bit ISO and burnt it to both DVD and USB stick as per the instructions on Ubuntu's download page. Neither will boot. Symptoms are as follows.
    DVD: When the iMac is restarted and booted from the DVD (labelled "Windows", which isn't mentioned in Ubuntu's website instructions), one line is displayed against a black screen with a message about the developer and date. After 5 minutes the message hangs and the DVD ceases to spin.
    USB stick: Strangely, I have to select the EFI Boot CD icon which appears after holding down the Alt key. A text menu appears offering to try Ubuntu without installing. I select this and the screen goes blank and stays blank.
    Any ideas? Lastly, after writing Ubuntu to the DVD and USB stick, neither could be read by OS X, which makes the instruction on Ubuntu's website to eject them useless. This might help? Thanks, Shane.

    Read the article

  • Hot video card in server

    - by DougN
    Not sure if this belongs here or on Super User (I looked at Super User and suspect there are more hardware gurus here). I have a server that sits in a cabinet. It's connected to a small screen that is normally off. However, the video card is running at about 210 °F all the time, while the rest of the PC is pretty cool (I'm getting the temperatures from SpeedFan). Any thoughts on a way to quiet/calm/cool the video card, since it's never really doing anything anyway? I'm usually logged out on the server, no screen saver is defined, and Windows is already set to turn off the screen for power saving after 5 minutes.

    Read the article

  • Gnome 3 - Multiple Video Cards - Xinerama -- Forced Fallback Mode

    - by Alvin
    I just installed a 2nd Nvidia video card; previously I had GNOME 3 working perfectly with 2 monitors on a single video card using TwinView. I have tried a number of things thus far:
    - TwinView on 1 card + Xinerama
    - no Xinerama, no TwinView
    - various manual xorg.conf hacks based on random forums (a couple of references below)
    - Xinerama, no TwinView, with and without the Composite extension
    The last one is what I'm using now; it results in a forced fallback mode, with Composite disabled at the end of xorg.conf via nvidia-settings:
        Section "Extensions"
            Option "Composite" "Disable"
        EndSection
    When I disable that last snippet, it boots to full GNOME 3 with the left monitor on a black screen and the middle monitor as primary but non-responsive. Switching to console mode with Ctrl+Alt+F1 and then switching back, I get 3 black screens with a mouse that can move around but nothing to interact with. The issue seems related to OpenGL and the multiple video cards: I can boot into Unity without issue, though my GLX-Dock shows up with a black background (as barely shows in the screenshot below), indicating that OpenGL is not initialised. Has anyone had any luck getting Xinerama to work with multiple Nvidia video cards with OpenGL support? Found this in the logs while looking a bit further:
        [ 23.208] (II) NVIDIA(1): Setting mode "nvidia-auto-select+0+0"
        [ 23.254] (WW) NVIDIA(1): The GPU driving screen 1 is incompatible with the rest of the
        [ 23.254] (WW) NVIDIA(1): GPUs composing the desktop. OpenGL rendering will be
        [ 23.254] (WW) NVIDIA(1): disabled on screen 1.
        [ 23.277] (==) NVIDIA(1): Disabling shared memory pixmaps
        [ 23.277] (==) NVIDIA(1): Backing store disabled
        [ 23.277] (==) NVIDIA(1): Silken mouse enabled
        [ 23.277] (==) NVIDIA(1): DPMS enabled
    According to this page in the NVidia user docs http://us.download.nvidia.com/XFree86/Linux-x86/173.14.09/README/chapter-14.html I may be out of luck =( Starting this question with the hope that others may be able to help debug and perhaps gather answers over time, as I really want to get full GNOME 3 back.

    Read the article

  • Using gluLookAt to move the camera in a 2D iPhone game?

    - by Mr.Gando
    Hey guys, I'm trying to use gluLookAt to move the camera in my iPhone game, but every time I've tried to use gluLookAt my screen just goes "blank" (grey in this case). I'm trying to render a simple triangle and to move the camera. This is my code to set up the scene:
        glViewport(0, 0, backingWidth, backingHeight);
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glRotatef(-90.0, 0.0, 0.0, 1.0);  // using iPhone in horizontal mode
        glOrthof(-240, 240, -160, 160, -1, 1);
        glMatrixMode(GL_MODELVIEW);
    Then my "triangle rendering" code looks like:
        GLfloat triangle[] = {0, 100, 100, 0, -100, 0,};
        glClearColor(0.7, 0.7, 0.7, 1.0);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glEnableClientState(GL_VERTEX_ARRAY);
        glColor4f(1.0, 0.0, 0.0, 1.0);
        glVertexPointer(2, GL_FLOAT, 0, &triangle);
        glDrawArrays(GL_TRIANGLES, 0, 6);
        glDisableClientState(GL_VERTEX_ARRAY);
    This draws a red triangle in the middle of the screen. When I try to apply gluLookAt (I got the implementation of the function from Cocos2D so I assume it's correct), I do:
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        gluLookAt(0,0,1,0,0,0,0,0,1); // try to move the camera a bit?
        GLfloat triangle[] = {0, 100, 100, 0, -100, 0,};
        glClearColor(0.7, 0.7, 0.7, 1.0);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glEnableClientState(GL_VERTEX_ARRAY);
        glColor4f(1.0, 0.0, 0.0, 1.0);
        glVertexPointer(2, GL_FLOAT, 0, &triangle);
        glDrawArrays(GL_TRIANGLES, 0, 6);
        glDisableClientState(GL_VERTEX_ARRAY);
    This leaves me with a grey screen (glClearColor is grey). I've tried all sorts of things and read what I've found about gluLookAt on the net, but no luck :(. If someone could explain or show me how to move the camera in a top-down fashion (Zelda, etc.), I would really appreciate it. Thanks!
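
    One likely culprit in the call above: gluLookAt(0,0,1, 0,0,0, 0,0,1) uses an up vector (0,0,1) that is parallel to the viewing direction (eye at z=1 looking at the origin), so the cross products inside gluLookAt collapse to zero and the resulting view matrix is degenerate, which typically leaves nothing on screen. A hedged sketch of the same setup with a conventional up vector, or just a translation, which is usually all a 2D top-down camera needs (camX/camY are hypothetical names for the camera position):

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        gluLookAt(camX, camY, 1.0f,    // eye: one unit out along +z
                  camX, camY, 0.0f,    // look at the same x/y on the z=0 plane
                  0.0f, 1.0f, 0.0f);   // up points along +y, not along the view axis
        // equivalent for a pure 2D scroll, with no gluLookAt at all:
        // glTranslatef(-camX, -camY, 0.0f);

    Separately, glDrawArrays(GL_TRIANGLES, 0, 6) in the code above requests 6 vertices while the array only holds 3 (x,y pairs), so a count of 3 is probably what was intended.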

    Read the article

  • libgdx game not disposing

    - by Yesh
    My game does not exit entirely even after calling the dispose() method. It loads a black screen when I launch it for the second time, and works well if I kill the game manually and restart it. I get an error that says "buffer not allocated with newUnsafeByteBuffer or already disposed" when I try to dispose of the SpriteBatch object; this is where I suspect the problem to be, but I have not been able to fix it entirely. Please help! Here is how I have built it (I have put the sample code here just to show you guys that there are no visible loops back into the dispose functions; please correct me if I'm wrong). In the game screen:
        public void dispose() {
            AssetLoader.dispose();
            render.dispose();
            Gdx.app.exit();
        }
    In the AssetLoader class:
        public void dispose() {
            Texture.dispose();
            sound.dispose();
        }
    In the game render class:
        public void dispose() {
            spritebatch.dispose(); // throws an error when GameScreen.dispose is called
            font.dispose();
            shaperender.dispose();
        }
    I believe that my SpriteBatch isn't disposing, which is causing the black screen, but I cannot find a way to dispose of it successfully. Any help would be greatly appreciated.
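
    Two things commonly produce exactly these symptoms in libgdx, so here is a hedged sketch (the field and class names are illustrative, not taken from the question): a black screen on the second launch usually means static references to Textures or other GL resources survived an Android process/activity restart while their GL context did not, and the "already disposed" error usually means dispose() runs twice, for example once from the screen and again when Gdx.app.exit() tears the application down. Re-creating assets on load, nulling statics, and guarding the dispose calls avoids both:

        import com.badlogic.gdx.Gdx;
        import com.badlogic.gdx.audio.Sound;
        import com.badlogic.gdx.graphics.Texture;

        public class AssetLoader {
            public static Texture texture;
            public static Sound sound;

            public static void load() {
                // re-create the assets on every launch instead of reusing old static instances
                texture = new Texture(Gdx.files.internal("texture.png"));
                sound = Gdx.audio.newSound(Gdx.files.internal("sound.wav"));
            }

            public static void dispose() {
                if (texture != null) { texture.dispose(); texture = null; }
                if (sound != null) { sound.dispose(); sound = null; }
            }
        }

        // In the screen: dispose each resource exactly once, e.g.
        //   if (batch != null) { batch.dispose(); batch = null; }
        // and avoid calling Gdx.app.exit() from inside dispose(), since the shutdown
        // it triggers can call dispose() a second time and hit the already-disposed batch.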

    Read the article

  • linux display drivers

    - by salman
    I've run into a major display problem on newly installed Fedora 11, on my 6-year-old PC which has a Pentium 4 2.4 GHz processor, 1 GB of DDR RAM, and an Intel 845 motherboard with an integrated graphics card. When I open an image or play a video, my entire screen turns garbled; I simply cannot make out what's on it. With difficulty I have to close the image/video window and move the folder window around to clean up the screen image. Is it because of my display drivers? How can I fix it? I also ran into MP3 plugin and Flash issues, which I was able to resolve. I'm new to Linux; the sole purpose of installing it on my old PC was to learn Linux, but this display problem is frustrating me. Thanks, Salman

    Read the article

  • How can I cull non-visible isometric tiles?

    - by james
    I have a problem which I am struggling to solve. I have a large map of around 100x100 tiles which form an isometric map, and the user is able to move the map around by dragging the mouse. I am trying to optimize my game to draw only the visible tiles. So far my code is like this; it appears to be OK in the x direction, but as soon as one tile goes completely above the top of the screen, the entire column disappears. I am not sure how to detect that all of the tiles in a particular column are outside the visible region.
        double maxTilesX = widthOfScreen / halfTileWidth + 4;
        double maxTilesY = heightOfScreen / halfTileHeight + 4;
        int rowStart = Math.max(0, (-xOffset / halfTileWidth));
        int colStart = Math.max(0, (-yOffset / halfTileHeight));
        rowEnd = (int) Math.min(mapSize, rowStart + maxTilesX);
        colEnd = (int) Math.min(mapSize, colStart + maxTilesY);
    EDIT - I think I have solved my problem, but perhaps not in a very efficient way. I have taken the coordinates of the center of the screen and determined which tile this corresponds to by converting the coordinates into cartesian format. I then update the entire box around the screen.
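
    The edit describes converting screen coordinates back to tile (cartesian) coordinates. A hedged sketch of that inverse mapping for the usual diamond isometric projection, where screenX = (tx - ty) * halfTileWidth and screenY = (tx + ty) * halfTileHeight; the names are illustrative and the axis/offset conventions may differ from the actual project:

        public class IsoCulling {
            // Convert a screen-space point (already corrected for the camera offset)
            // into clamped tile coordinates.
            static int[] screenToTile(double screenX, double screenY,
                                      double halfTileWidth, double halfTileHeight, int mapSize) {
                double tx = (screenX / halfTileWidth + screenY / halfTileHeight) / 2.0;
                double ty = (screenY / halfTileHeight - screenX / halfTileWidth) / 2.0;
                int tileX = Math.min(mapSize - 1, Math.max(0, (int) Math.floor(tx)));
                int tileY = Math.min(mapSize - 1, Math.max(0, (int) Math.floor(ty)));
                return new int[] { tileX, tileY };
            }
        }

    Applying this to all four screen corners and taking the min/max of the resulting tile coordinates gives a row/column range that covers every potentially visible tile, plus whatever margin is needed for partially visible ones.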

    Read the article

  • Input not supported message when monitor is powered off then back on

    - by Jason Down
    I've been getting a message on my monitor where "Input not supported" floats around the screen. This only happens when I manually turn the monitor off and then later turn it back on. Leaving the monitor on and allowing it to go to the screen saver doesn't seem to cause the issue (but I prefer to turn the monitor off if I'm going to be away from the computer for any length of time). Any ideas what might cause this, only when the monitor is turned off manually? Specs:
        Acer X203w monitor
        Radeon 9600 Pro video card
        Linux Mint 8
        Resolution 1680 x 1050 (16:10 - preferred native resolution for the monitor)
        Refresh rate 60 Hz
    Here is what is in my xorg.conf file:
        Section "Device"
            Identifier "Radeon 9600"
            Driver "ati"
            BusID "PCI:1:0:0"
            Option "XAANoOffscreenPixmaps"
            Option "AccelMethod" "XAA"
        EndSection
        Section "Screen"
            Identifier "Default Screen"
            Device "Radeon 9600"
            DefaultDepth 24
            SubSection "Display"
                Depth 24
                Modes "1680x1050" "1440x900" "1024x768"
            EndSubSection
        EndSection
        Section "DRI"
            Mode 0666
        EndSection
        Section "Extensions"
            Option "Composite" "Enable"
        EndSection

    Read the article

  • Reboot failure after upgrade from 8.04 LTS to 10.04 LTS

    - by Alan Fietz
    I bought our computer from Freegeeks with Ubuntu 8.04 installed. I upgraded from Ubuntu 8.04 to 10.04 on Thursday, November 10. I have an ASUS P4P800SE with a dual Intel P4 @ 3 GHz. Installation messages were:
    - Error loading Nautilus config info
    - Replaced customized /etc/login.defs
    - Replaced customized /etc/dhcp3/dhclient.conf
    - 189 packages removed
    - WARNING: Failed to read mirror file
    When I rebooted, the usual ASUS screen appeared, then "Loading GRUB", then "starting Up...", then "starting Up..." again, then a blank screen (the monitor went dormant). I rebooted, started GRUB and selected version 10.04.3 LTS, kernel 2.6.32-35 generic; I got the same results. I rebooted, started GRUB and selected kernel 2.6.24-29 generic. Here's what was displayed:
        udevd [875]: error getting socket: Invalid argument
        libudev: udev_monitor_new_from_netlink: error getting socket: Invalid argument
        Segmentation fault
        Gave up waiting for root device
        Common problems
        - Boot args (cat /proc/cmdline)
        - Check root delay
        - check root
        - Missing modules (cat /proc/modules;
        Alert! /dev/disk/by_vvid/c59c6361 etc... does not exist. Dropping to a shell.
    Then BusyBox v1.13.3 started with the following prompt: (initramfs) _ but my typing did not appear on the screen. It appears the hard drive cannot be found. Any suggestion on how to remedy this? Thank you.
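
    The "Gave up waiting for root device" drop into BusyBox generally means the initramfs cannot find the device named in the kernel's root= parameter. A hedged sketch of the usual first checks, run either from the (initramfs) prompt if the keyboard responds there, or from a live CD (the commands are standard; any device names they print are examples only):

        cat /proc/cmdline        # shows the root=UUID=... value the kernel was given
        ls /dev/disk/by-uuid     # shows the UUIDs the initramfs can actually see
        blkid                    # from a live CD: list partitions and their UUIDs

    If the UUID on the kernel command line does not appear in that list, the entry in /boot/grub/grub.cfg (or the drive itself, or its controller mode in the BIOS) is the thing to chase.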

    Read the article

  • How do I stop Ubuntu from detaching minimize/maximize/close buttons?

    - by Shahbaz
    Some time ago I managed to get Ubuntu to keep the window menu bars in the windows themselves rather than in the bar at the top (I'm not sure if this part is Unity or Compiz, or what the difference is); that was done by removing indicator-appmenu. Anyway, now everything is fine except one thing: if I have a window that is full screen, the minimize/maximize/close buttons are still grabbed by the bar at the top. Usually this doesn't cause a problem, because the upper-left corner of the full-screen window and of the whole screen are not far apart. However, one thing happens to me a lot: I am working on something (programming), then I need to check some things elsewhere, so I open some windows, see what I want, and switch back to my work. Those windows, however, are temporary, so at some point I want to close them. Now here's what happens: I have focus on some window and I can't close the maximized window behind it unless I click on that window first, so that the buttons appear, and then close it. I couldn't find anything on the internet about this. Is this something that's hardcoded in Unity/Compiz/whatever, or is there actually a way to configure it?

    Read the article

  • Sudden restart when Ubuntu has almost loaded from disc

    - by Wesley
    Hi all, here are the specs beforehand:
        ECS K7SEM motherboard
        AMD Duron 900 MHz
        2x 256 MB PC133 SDRAM
        Best Power MT-500P 500 W PSU
        Integrated graphics
        No hard drive
        DVD-ROM - will update with brand & model
    Now, I was starting up this machine after it had been left outside for 3 months in winter weather (I got it from a friend). I was able to get it started and tried to load Ubuntu from the DVD-ROM. It was fairly successful and got up to the point where the Ubuntu logo is glowing. However, when Ubuntu was about to go to the main screen, the computer crashed and automatically rebooted. Is there any reason why this is happening? Also, I should mention that when I try to hit Delete on the BIOS screen to go to Setup, it only shows a screen with four lines saying something about Novell... I will edit with the exact lines. Should I be resetting the BIOS or something? Thanks in advance.

    Read the article

  • How do I disable the fade out/fade in effect when unlocking a Windows 7 workstation?

    - by Timwi
    When I press Win+L, the “Locked” screen (with the password prompt) appears immediately. That’s nice, but not terribly important: I’m probably leaving the computer anyway. But after I type the password (to unlock the workstation), the desktop doesn’t appear immediately: instead, the “Locked” screen slowly fades out, the desktop slowly fades in, wasting my time, and all keys (e.g. Win+R) pressed during this interval are completely swallowed, forcing me to wait unnecessarily. This is extremely annoying because when I unlock the workstation, I generally want to use my computer. How do I disable this fade out/fade in effect and have the desktop appear immediately, in the same way that the “Locked” screen appears immediately?

    Read the article

  • autostart app with tag in awm

    - by nonsenz
    While giving awm a try I encountered some problems. I want to autostart some apps when awm is started, with specific tags. Here's the relevant config I use for that. First my tags with layouts:
        tags = {
            names  = {"mail", "www", "video", "files", 5, 6, 7, 8, 9},
            layout = {layouts[11], layouts[11], layouts[11], layouts[11], layouts[1],
                      layouts[1], layouts[1], layouts[1], layouts[1]}
        }
        for s = 1, screen.count() do
            -- Each screen has its own tag table.
            tags[s] = awful.tag(tags.names, s, tags.layout)
        end
    Now the app-autostart stuff:
        awful.util.spawn("chromium-browser")
        awful.util.spawn("firefox")
        awful.util.spawn("vlc")
        awful.util.spawn_with_shell("xterm -name files -e mc")
        awful.util.spawn_with_shell("xterm -name 5term")
        awful.util.spawn_with_shell("xterm -name 5term")
        awful.util.spawn_with_shell("xterm -name 5term")
        awful.util.spawn_with_shell("xterm -name 5term")
        awful.util.spawn_with_shell("xfce4-power-manager")
    I use xterm with the -name param to give them custom classes (for custom tags via rules). And now some rules to connect apps with tags:
        awful.rules.rules = {
            -- All clients will match this rule.
            { rule = { },
              properties = { border_width = beautiful.border_width,
                             border_color = beautiful.border_normal,
                             focus = true,
                             keys = clientkeys,
                             buttons = clientbuttons } },
            { rule = { class = "MPlayer" },  properties = { floating = true } },
            { rule = { class = "pinentry" }, properties = { floating = true } },
            { rule = { class = "gimp" },     properties = { floating = true } },
            -- Set Firefox to always map on tags number 2 of screen 1.
            -- { rule = { class = "Firefox" },
            --   properties = { tag = tags[1][2] } },
            { rule = { class = "Firefox" },          properties = { tag = tags[1][2] } },
            { rule = { class = "Chromium-browser" }, properties = { tag = tags[1][1] } },
            { rule = { class = "Vlc" },              properties = { tag = tags[1][3] } },
            { rule = { class = "files" },            properties = { tag = tags[1][4] } },
            { rule = { class = "5term" },            properties = { tag = tags[1][5] } },
        }
    It works for Chromium, Firefox and VLC, but not for the xterms with the "-name" param. When I check the xterms after they have started, xprop shows:
        WM_CLASS(STRING) = "5term", "XTerm"
    I think that should work, but the xterms are placed on the first workspace/tag.
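
    One detail worth noting about the xprop output quoted above: WM_CLASS holds two fields, and "xterm -name 5term" sets the first one (the instance) while the second (the class) stays "XTerm". awful.rules matches class against that second field, which would explain why the xterm rules never fire. A hedged sketch of the same rules keyed on instance instead:

        -- matches the WM_CLASS instance field set by xterm's -name option
        { rule = { instance = "files" }, properties = { tag = tags[1][4] } },
        { rule = { instance = "5term" }, properties = { tag = tags[1][5] } },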

    Read the article

  • ubuntu 10.04 console resolution

    - by Ove
    I have installed Ubuntu 10.04 on my HP Pavilion dv6000. After I installed it, the text in the console (when I press Alt+F1) was small and the console had a good resolution (I think the same as my LCD, 1280x800). Also, at boot, the "Ubuntu" logo was small and centered in the middle of the screen. That was good. After that, I installed the nVidia driver via the "System > Administration > Hardware Drivers" screen. After the driver was installed, the text in the console was larger and more pixelated, and the "Ubuntu" logo was also much larger and looked uglier because it was pixelated. Can anyone help me change the resolution of the console and boot screen back to what it was before I installed the nVidia driver?
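
    The proprietary nVidia driver of that era does not drive a high-resolution framebuffer console itself, so the text console tends to fall back to a coarser VESA mode once the driver is installed. A hedged sketch of the knobs people usually try on 10.04's GRUB 2, with no guarantee for this particular card (edit /etc/default/grub, then run "sudo update-grub" and reboot):

        GRUB_GFXMODE=1280x800
        GRUB_GFXPAYLOAD_LINUX=keep   # keep the GRUB graphics mode for the kernel console

    The 1280x800 value is only the panel's native mode as described in the question; if GRUB cannot set it (press "c" at the GRUB menu and run "vbeinfo" to list the modes the VESA BIOS offers), a smaller 16:10 mode may be the best available.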

    Read the article

  • How can I make my monitor run at its native resolution under Kubuntu 9.10?

    - by Adam Matan
    Hi, I have installed Kubuntu 9.10 afresh on an HP desktop computer with a Samsung SyncMaster 2243 and an Intel integrated graphics card. The screen resolution is fixed at 1280x1024 instead of the native 1680x1050, which makes my eyes bleed.
        $ lspci -k | grep "VGA" -A2
        00:02.0 VGA compatible controller: Intel Corporation 82G33/G31 Express Integrated Graphics Controller (rev 10)
            Kernel driver in use: i915
            Kernel modules: i915
    And my xorg.conf:
        /etc/X11$ cat xorg.conf
        Section "Device"
            Identifier "Configured Video Device"
            Driver "vesa"
        EndSection
        Section "Monitor"
            Identifier "Configured Monitor"
        EndSection
        Section "Screen"
            Identifier "Default Screen"
            Monitor "Configured Monitor"
            Device "Configured Video Device"
        EndSection
    Any ideas how to make this driver work? I found no working solutions in my Google searches. Thanks, Adam
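
    The xorg.conf above pins the generic "vesa" driver, which only offers a small set of standard modes and will not produce the panel's native 1680x1050, even though the kernel side (i915) is already loaded. A hedged sketch of the usual fix, assuming the xserver-xorg-video-intel package is installed; deleting xorg.conf entirely and letting X autodetect often has the same effect:

        Section "Device"
            Identifier "Configured Video Device"
            Driver     "intel"    # use the Intel DDX instead of the generic vesa driver
        EndSection

    After restarting X, the native mode should be selectable; if not, it can be requested explicitly with xrandr, e.g. xrandr --output VGA1 --mode 1680x1050, where the output name VGA1 is an assumption to be checked with a bare xrandr call first.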

    Read the article
