Search Results

Search found 104331 results on 4174 pages for 'off by one'.

Page 30/4174

  • Laptop abruptly powers off after a few seconds of booting

    - by Alan Mendelevich
    I have a 3-year-old HP Pavilion dv2208 laptop. Recently it started abruptly powering off about 20-30 seconds into the Windows boot sequence after almost every reboot/shutdown. Even if I leave it at the Repair/Start Windows Normally stage, it powers off anyway. The only way I've managed to work around this is to enter the BIOS setup screen and leave it there for no less than 10 minutes. I don't know what happens there, but it helps every time. Any ideas for ways to fix this that don't involve replacing the motherboard are highly appreciated. P.S.: I've tried resetting the BIOS to defaults, updating to the latest BIOS version, etc. This happens with both Vista and Windows 7.

    Read the article

  • Windows 7 trying to turn off UAC every time Windows starts

    - by Mehper C. Palavuzlar
    I have a strange problem on my HP laptop. It began to happen recently. Whenever I start my machine, Windows 7 Action Center displays the following warning: You need to restart your computer for UAC to be turned off. I never disable UAC, but obviously some process or virus (I'm not sure, only guessing) causes this. As soon as I get this warning, I head for the UAC settings and re-enable UAC to dismiss it. This is a bothersome situation as I really don't know what causes the problem. I have run a full scan on the computer for any probable virus activity, but TrendMicro OfficeScan said that no viruses were found. There are no other strange incidents on the machine. Everything works fine except for this bizarre issue. How can I find out what process is trying to turn off UAC? What way should I follow to overcome this problem?

    Read the article

  • Logging another person off in Windows 7 using Task Manager

    - by BBlake
    Under WinXP, I could use Task Manager's Users tab to log off my wife's account, which she always leaves logged in, so I don't have to log in to her account and log it out. It's an older machine, so I used that trick to free up every resource I could that might potentially slow down the game I'm playing at the time. I recently upgraded the machine to Win7, and when I try the same trick, I get an access-denied popup. My logged-in account does have admin rights, so is it as simple as running Task Manager "as an Administrator" in order to allow this? If so, how can I pull up Task Manager (other than the standard CTRL-ALT-DELETE) so it pops up with admin rights, in order to log her account off in this manner?

    Read the article

  • PC turns off in safe mode

    - by abbasi
    I wanted to scan a Dell Inspiron N5010 laptop in safe mode. When I booted it into safe mode and ran the AV to scan, after a while the PC shut off! The power options (when plugged in) are as follows: screensaver after 4 mins, dim the display after 10 mins, turn off the display after 15 mins; hard disk, sleep and hibernate are all off (set to never). I also tested the machine without scanning it, i.e., I went into safe mode and waited to see what happens as time passes. From my observations so far (I can run more tests to determine where exactly the problem is), the problem occurs when the machine is in safe mode and is doing something (in this case, scanning). What do you think? Why does it shut off in this situation? Thanks in advance.

    Read the article

  • Monitor turns off about 1 second after turning it on

    - by r0ca
    Hi all! I have a hardware issue with some of the LCD monitors we have in our office. My problem is not related to video cards or anything else with the computer itself. I have 2 Dell 17" LCD screens that go blank after 1 or 2 seconds. The light remains green, so it's not idling or sleeping. Just this morning I had to replace one that goes blank, and this happens on 2 workstations and also on my laptop. Nothing helps: I tried both the VGA and DVI connections without luck. I strongly suspect this is something with a capacitor or something else inside the screen, but I can't figure out what it is... Has anybody heard of that kind of issue before? Regards, David.

    Read the article

  • Internet access only works after turning off and on the local connection

    - by AgentFire
    Every time I turn on the computer, it has no internet access. The network card is connected to a router which broadcasts the internet all over the office LAN. Once I turn the local connection (or network card) off and then on again, internet access works. Rebooting and logging out/in again does not help; only turning the network card off and on gives me internet access. Why is that? How do I fix it?

    Read the article

  • Sony Bravia KDL-32L5000 PC resolution slightly off

    - by user18818
    I have a PC running two Nvidia 8500 GTs in SLI mode and I am trying to use my TV in dual mode. When I switch the TV to PC, the screen is nearly centered, with a slight offset. All resolutions are affected, from 800x600 all the way up to the TV's native 1360x768. I have tried with SLI on and off, and have PhysX turned off as well, as I thought that might have an effect. I am running Windows XP 64-bit SP2, DirectX 9.0c, Nvidia driver version 181.22. If you need any other information, please let me know. Thanks in advance.

    Read the article

  • Computer turns off and on after start, then goes dead

    - by Shiki
    I built a new PC from the following components:
    - CPU: Intel Core i7 950
    - MB: Gigabyte X58A-UD3R
    - RAM: 2x2GB i7 Corsair memory
    - VGA: Zotac AMP2 GTX260
    - HDD: 1 GreenSATA HDD (Western Digital 500GB RE2)
    When I turn it on, it runs for a few seconds with the fans at maximum speed, then turns off. Then it starts again by itself, runs with the fans at max speed, and nothing happens. At first I suspected my PSU. It's a Chieftec 450AA PSU. After I borrowed a Chieftec 550AA PSU, I tried to start with that. Exact same story. Any idea? Do I need a bigger PSU? The reason I can't localize the problem: I have never seen this turn on, off, on behavior before. Even an answer to that alone would already help people like me with the same problem.

    Read the article

  • WinXP on VPC - Unable to change the way users log on or off

    - by kamleshrao
    On my Win7 computer, I have set up a new Win-XP VPC. In the VPC window, when I press Ctrl+Alt+Del, it shows me Windows Task Manager. As per MS KB [ http://support.microsoft.com/kb/281980 ], we can change this behavior to show the regular Windows Security window. But while making this change, I get the following error: "User Accounts: Fast User Switching cannot be turned off from a remote connection to this computer. Log on to the computer locally to turn off Fast User Switching." Is there any way I can fix this setting?

    Read the article

  • Windows 8.1 backlight turns off

    - by tenhouse
    This is one of the strangest things I have ever seen. I have an HP EliteBook 8460p and recently updated my Windows 8 to Windows 8.1. Now, when I have certain windows open and focused (it's always different; sometimes it's Chrome, sometimes it's the Control Panel, and so on, there is no clear pattern), the backlight of the screen is completely turned off. When I take a torch and point it at the screen, or hold the laptop toward a bright light source, I can see there is still a picture there; you could even work (or rather, you could if you were able to see anything). When I tab to certain other windows, the backlight comes back on; I can then tab back to the other window and the backlight turns off again. I have no idea what's going on and also no idea what information you need. The display adapter is the AMD Radeon HD 6400M; I tried updating to the newest driver, but that didn't help. Here's a video of the whole thing: http://www.youtube.com/watch?v=kIV_Q8uayUA

    Read the article

  • Windows 7/Outlook 2010 cut off controls on form windows

    - by D..
    I am running Windows 7 with multiple monitors: 2 at 1920x1200 and 1 at 1600x900 resolution. Controls are cut off and I cannot view the entire content of the window. As far as I can tell I only have the issue with Outlook 2010; it may be present in other applications, but I haven't noticed it. For example, to get to the More Settings button I have to use Tab and then press Enter; the More Settings button is never visible. I have used applications that allow you to force windows to be resizable, however the anchoring is such that the content remains cut off. This has been present over multiple clean installs on the system. My DPI is set to 100%, unlike in this issue.

    Read the article

  • "Turn Off The Lights" on any website?

    - by gojira
    On YouTube, there is this nice button (easy to overlook, top left of the video) which lets one "turn off the lights": the site background changes from white to black, and the text color changes from black to grey. There is an unrelated plug-in for Firefox called "Turned Off The Lights", which has very similar functionality. This makes websites so much easier to read. However, both technologies only work on YouTube. Is there anything to achieve the same effect for all websites? Preferably with Firefox? I.e., I want to have a very dark background and light text color on all websites viewed with Firefox; how can I do that?

    Read the article

  • Logging off does not kill process in Windows Server 2003

    - by user25951
    I have a Windows Server 2003 (Enterprise, SP2). My understanding was that any process created by a user will be terminated when the user logs off the account, but that's not happening. I log in via the Administrator account, start a simple Java process, and log off, but the process is not killed. Is there any configuration for this or something? I am mostly a software programmer and not much into servers, so I am stuck. I found out that while logging off: 1) Win32 is supposed to send a CTRL_LOGOFF_EVENT to all processes started by that user; 2) the JVM is supposed to handle this event and terminate the VM. But I can't understand why my Java process is not killed when I log off. Any ideas?

    Read the article

  • Advice on off-site backup of Hyper-V Failover Cluster

    - by Paul McCowat
    We are currently setting up a Server 2008 R2 box which will be off-site, connected over a leased line with VPN. At the main site are 2 x Hyper-V hosts in a failover cluster with a PowerVault M3000i iSCSI SAN. We are using BackupAssist for local backups; each host backs up itself and its guests nightly, creating a 500GB backup each, which is copied to a 2TB rotated NAS drive. Files and SQL DBs are also backed up / log shipped etc. I'm looking for the best way to back up the Hyper-V VMs and copy them off-site so that the OS backups are at most a month old and the data is at most a day old. The main backups are too large to transfer between backup runs, so the options discussed so far are: take rotating individual backups of the VMs each day and copy them over (day 1 the SQL VM, day 2 the Exchange VM, etc.), which would require more storage; look into Hyper-V snapshots, although I don't believe these are supported in clustering; or 3rd-party replication tools.

    Read the article

  • After turning my monitor off and on, it will display only a white screen

    - by Narf the Mouse
    About a month after installing a new graphics card, I started encountering a rather frustrating problem. Namely, if I turn my monitor off for any significant length of time, then turn it back on, it displays only a white screen. Previously, restarting could fix the problem. However, after leaving the computer off last night, the problem persists. An internet search turned up this site; however, the monitor cable is not loose. As for the insides of the monitor - Well, I could poke around, but I risk making it worse if it's not the monitor. Any such instructions should be clear, detailed and include pictures. Further updates as events warrant.

    Read the article

  • PowerShell One Liner: Duplicating a folder structure in a Sharepoint document library

    - by Darren Gosbell
    I was asked by someone at work the other day if it was possible in Sharepoint to create a set of top level folders in one document library based on the set of folders in another library. One document library has a set of top level folders that is basically a client list, and we needed to create the same top level folders in another library. I knew that it was possible to open a Sharepoint document library in Explorer using a UNC style path and that you could map a drive using a technique like this one: http://www.endusersharepoint.com/2007/11/16/can-i-map-a-document-library-as-a-mapped-drive/. But while Explorer would let us copy the folders, it would also take all of the folder contents too, which was not what we wanted. So I figured that some sort of PowerShell script was probably the way to go, and it turned out to be even easier than I thought. The following script did it in one line, so I thought I would post it here in my "online memory". :)

        dir "\\sharepoint\client documents" | where {$_.PSIsContainer} | % {mkdir "\\sharepoint\admin documents\$($_.Name)"}

    I use "dir" to get a listing from the source folder, pipe it through "where" to get only objects that are folders, and then do a foreach (using the % alias) and call "mkdir".
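
    For comparison, here is the same idea as a short Python sketch (my own rough equivalent, not part of the original post; it assumes the script runs on a Windows machine that can reach the same UNC paths):

        # Duplicate only the top-level folder structure of one SharePoint
        # document library into another, without copying any file contents.
        import os

        src = r"\\sharepoint\client documents"   # source library (from the post)
        dst = r"\\sharepoint\admin documents"    # destination library (from the post)

        for name in os.listdir(src):
            if os.path.isdir(os.path.join(src, name)):            # folders only
                os.makedirs(os.path.join(dst, name), exist_ok=True)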

    Read the article

  • Merge two different API calls into One

    - by dhilipsiva
    I have two different apps in my Django project. One is "comment" and the other is "files". A comment might have some files attached to it. The current way of creating a comment with attachments is by making two API calls. The first one creates the actual comment and replies with the comment ID, which serves as a foreign key for the Files. Then, for each file, a new request is made with the comment ID. Please note that "files" is a generic app that can be used with other apps too. What is the cleanest way of making this into one API call? I want this as a single API call because I am in a situation where I need to send the user an email with all the files as attachments when a comment is made. I know queueing is the ideal way to do it, but I don't have the liberty to add queueing to our stack now. So this was the only way I could think of. A sketch of what such a combined endpoint could look like follows below.
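
    A minimal sketch of how such a single endpoint could look in plain Django (my assumption, not the poster's code: the Comment and File model names and fields are hypothetical, a plain foreign key stands in for whatever generic relation the files app really uses, and CSRF/authentication handling is omitted). It accepts the comment text plus any number of uploaded files in one multipart/form-data POST and saves everything in a single transaction:

        from django.db import transaction
        from django.http import JsonResponse
        from django.views.decorators.http import require_POST

        from comment.models import Comment   # hypothetical model
        from files.models import File        # hypothetical model

        @require_POST
        def create_comment_with_files(request):
            # Expects a "text" field and zero or more "attachments" files.
            with transaction.atomic():  # comment and files saved all-or-nothing
                comment = Comment.objects.create(
                    author=request.user,
                    text=request.POST.get("text", ""),
                )
                for upload in request.FILES.getlist("attachments"):
                    File.objects.create(comment=comment, file=upload)
            # Everything needed for the notification e-mail now exists,
            # so it can be sent right here without a queue.
            return JsonResponse({"id": comment.id}, status=201)

    The client then sends one request instead of N+1, and the e-mail can be assembled from the saved comment and its attachments before the response is returned.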

    Read the article

  • Dual monitors with one above the other?

    - by Felix
    I'm using Gnome 3 and the proprietary Nvidia drivers. In nvidia-settings, I have tried setting my external monitor to be "above" my main one (it's a laptop). However, when I try to drag a window up from the main display to the external one, it gets stuck and can't move past a certain point. Trying to maximize it changes its decoration so it looks maximized (i.e. no borders, etc.), but its size and position don't change. Now, if I set my external monitor to be "to the left" of the main one, it works, which is why I suspect this is a Gnome issue, not an Nvidia one. Anyone know how to fix this? Update: some versions: Gnome 3.2.2.1, Nvidia 280.13. Update 2: I can see that Gnome 3.4 is out, and among the release notes is better external monitor support. However, they only mention a small fix that is unrelated to my problem. Can anyone with Gnome 3.4 and access to an external monitor please test this out and tell me if it works? I don't want to go through the hassle of upgrading my Ubuntu installation unless I know for certain it's going to fix the problem.

    Read the article

  • Complex shading using one single (small) texture

    - by teodron
    Recently I stumbled upon a demo reel in UDK about how one can attain beautiful results using just one (rather tiny) texture that's sent through the shader pipeline. The famous link is this one. Basically, the author states that they've used just one texture and give a snapshot of the technique here. I see that every RGBA channel contains different grayscale information, and that info could be used inside a shader to obtain a colour-blended output. The problem is that the reel displays a fairly complex scene. To top that, the author even makes use of a normal map. How did they manage to fit a normal map into an already cluttered texture? It makes sense to have a half-space normal map using only RG from an RGB texture, but what about the rest of the information? Since it was proven to be possible, could someone please explain how it was done (the big picture, not the dirty details)? Here's the texture being used.
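
    To make the channel-packing idea concrete, here is a small numpy sketch of one plausible layout (my own guess, not the actual UDK material): R and G carry the tangent-space normal's x and y, B and A carry two grayscale masks, and the "shader" step rebuilds the normal's z from x and y.

        import numpy as np

        h, w = 4, 4
        rng = np.random.default_rng(0)

        # Stand-in data: a tangent-space normal map (unit vectors, z >= 0)
        # plus two grayscale masks (say, ambient occlusion and a blend weight).
        n = rng.normal(size=(h, w, 3))
        n[..., 2] = np.abs(n[..., 2])
        n /= np.linalg.norm(n, axis=-1, keepdims=True)
        mask_a = rng.uniform(0.0, 1.0, (h, w))
        mask_b = rng.uniform(0.0, 1.0, (h, w))

        # Pack into one RGBA image: R,G = normal xy remapped from [-1,1] to [0,1],
        # B and A = the two masks.
        packed = np.empty((h, w, 4))
        packed[..., 0:2] = n[..., 0:2] * 0.5 + 0.5
        packed[..., 2] = mask_a
        packed[..., 3] = mask_b

        # Unpack (what the pixel shader would do): recover x,y and rebuild
        # z = sqrt(1 - x^2 - y^2), valid because the normal is unit length
        # and z was non-negative to begin with.
        xy = packed[..., 0:2] * 2.0 - 1.0
        z = np.sqrt(np.clip(1.0 - np.sum(xy * xy, axis=-1), 0.0, 1.0))
        recovered = np.dstack([xy, z])

        assert np.allclose(recovered, n, atol=1e-5)

    The reconstruction works because a unit-length tangent-space normal satisfies x^2 + y^2 + z^2 = 1 with z >= 0, so z is fully determined by x and y; storing z is therefore unnecessary, which leaves the remaining channels free for mask data.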

    Read the article

  • Azure, don't give me multiple VMs, give me one elastic VM

    - by FransBouma
    Yesterday, Microsoft revealed major new features for Windows Azure (see ScottGu's post). It all looks shiny and great, but after reading most of the material describing the new features, I still find the overall idea behind all of it flawed: why should I care how many VMs my web app runs on? Isn't that a problem for the Windows Azure engineers / software to solve? And if I need the file system, why can't I simply get a virtual filesystem?

    To illustrate my point, let's use a real example: a product website with a customer system/database and next to it a support site with an accompanying database. Both are written in .NET, use ASP.NET, and each uses a SQL Server database. The product website offers files for customers to download, very simple. You have a couple of options to host these websites:

    - Buy a server, place it in a rack at an ISP and run the sites on that server.
    - Use 'shared hosting' with an ISP, which means your sites' appdomains run on the same machine as others', the files are stored there as well, and the databases are hosted on the same server as the other shared databases.
    - Hire a VM, install your OS of choice at an ISP, and host the sites on that VM; basically the same as the first option, except you don't have a physical server.
    - Host at some cloud vendor, either 'shared' or in a VM. See above.

    With all of those options, scalability is a problem, even the cloud-based ones, though not for the same reasons:

    - The physical server solution has the obvious problem that if you need more power, you need to buy a bigger server or more servers, which requires you to add replication and other overhead.
    - Shared hosting solutions are almost always capped on memory usage / traffic and database size: if your sites get too big, you have to move out of the shared hosting environment and start over with one of the other solutions.
    - The VM solution, be it a VM at an ISP or 'in the cloud' at e.g. Windows Azure or Amazon, in theory allows scaling out by simply instantiating more VMs; however, that too introduces the same overhead problems as with physical servers: suddenly more than one instance runs your sites.

    If a cloud vendor offers its services in the form of VMs, you won't gain much over having a VM at some ISP: the main problems you have to work around are still there. When you spin up more than one VM, your application must be completely stateless at any moment, including the DB subsystem, because what's in memory in instance 1 might not be in memory in instance 2. This might sound trivial, but it's not. A lot of the websites out there started rather small: they were perfectly runnable on a single machine with normal memory and CPU power. After all, you don't need a big machine to run a website with even thousands of users a day. Moving these sites to a multi-VM environment causes a problem: all the in-memory state they use, all the multi-page transitions they perform while keeping state across the transition, they can't do that anymore like they did on a single machine. State is something of the past; you have to store every byte of state in a DB, a viewstate or a cookie somewhere, so that with the next request all state information is available through the request, as nothing is kept in memory.

    Our example uses a bunch of files in a file system. Using multiple VMs will require that these files move to a cloud storage system which is mounted in each VM, so we don't have to store the files on each VM. This might require different file paths, but this change should be minor. What's perhaps less minor is the maintenance procedure on the new type of cloud storage used: instead of ftp-ing into a VM, you might have to update the files using different ways / tools.

    All in all, this makes moving an existing website which was written for an environment that's based around a VM (namely .NET with its CLR) overly cumbersome and problematic: it forces you to refactor your website system to be usable 'in the cloud', which is caused by the limited way in which e.g. Windows Azure offers its cloud services: in blocks of VMs.

    Offer a scalable, flexible VM which extends with my needs

    Instead, cloud vendors should simply offer one VM to me. On that VM I run the websites and store my DB and my files. As it's a virtual machine, how this machine is actually run on physical hardware (e.g. partitioned) I don't care about, as that's a problem for the cloud vendor to solve. If I need more resources, e.g. I have more traffic to my server, way more visitors per day, the VM stretches, as if I had bought a bigger box. This frees me from the problems which come with multiple VMs: I don't have any refactoring to do at all. I can simply build my website as if it runs on my local hardware server, upload it to the VM offered by the cloud vendor, install it on the VM, and I'm done.

    "But that might require changes to Windows!" Yes, but Microsoft is Windows. Windows Azure is their service; they can make whatever change to what they offer to make it look like it's Windows. Yet they're stuck, like Amazon, in thinking in VMs, which forces developers to 'think ahead' and gamble on whether they will need to migrate to a cloud with multiple VMs in the future or not. Which comes down to: gamble on whether they should invest time in code / architecture which they might never need. (YAGNI anyone?)

    So the VM we're talking about: is that a low-level VM which runs a guest OS, or is it a different kind of VM?

    The flexible VM: .NET's CLR?

    My example websites are ASP.NET based, which means they run inside a .NET appdomain, on the .NET CLR, which is a VM. The only physical OS resource the sites need is the file system, and this too is accessed through .NET. In short: all the websites see is what .NET allows them to see; the world as the websites know it is what .NET shows them and lets them access. How the .NET appdomain is run physically, that's the concern of .NET, not mine.

    This begs the question why Windows Azure doesn't offer virtual appdomains. Or better: .NET environments which look like one machine but could physically be multiple machines. In such an environment, no change has to be made to the websites to migrate them from a local machine or your own server to the cloud to get proper scaling: the .NET VM will simply scale with the need: more memory needed, more CPU power needed, it stretches. What it offers to the application running inside the appdomain simply increases, but is not fragmented: all resources are available to the application. This means that the problem of how to scale is back where it should be: with the cloud vendor.

    "Yeah, great, but what about the databases?" The .NET application communicates with the database server through an ADO.NET provider. Where the database is located is not a problem of the appdomain: the ADO.NET provider has to solve that. In other words: we can host the databases in an environment which offers itself as a single resource and is accessible through one connection string, without replication overhead on the outside, and use that environment inside the .NET VM as if it were a single DB.

    But what about memory replication and other problems? This environment isn't simple, at least not for the cloud vendor. But it is simple for the customer who wants to run his sites in that cloud: no work needed, no refactoring of existing code needed. Upload it, run it.

    Perhaps I'm dreaming and what I described above isn't possible. Yet I think that if cloud vendors don't move in that direction, what they're offering isn't interesting: it doesn't solve a problem at all; it simply offers a way to instantiate more VMs with the guest OS of choice, at the cost of me needing to refactor my website code so it can run in the straitjacket form factor dictated by the cloud vendor. Let's not kid ourselves here: most of us developers will never build a website which needs a truckload of VMs to run it; almost all websites created by developers can run on just a few VMs at most. Yet the most expensive change is right at the start: moving from one to two VMs. As soon as you have refactored your website code to run across multiple VMs, adding another one is just as easy as clicking a mouse button. But that first step, that's the problem here, and as it's right there at the beginning of scaling the website, it's particularly strange that cloud vendors refuse to solve that problem and leave it to the developers. Which makes migrating 'to the cloud' particularly expensive.

    Read the article

  • Nexus One Guys…Android 2.3 update coming your way

    - by Boonei
    Good news! If you are a Nexus One customer, Google said in a tweet: "The Gingerbread OTA for Nexus One will happen in the coming weeks. Just hang tight!" Non-Nexus owners have to wait much, much longer; we don't know when their phone maker and operator will roll out the same update. This article, titled "Nexus One Guys…Android 2.3 update coming your way", was originally published at Tech Dreams. Grab our RSS feed or fan us on Facebook to get updates from us.

    Read the article

  • Deleting the old Tomcat version and setting up a new one

    - by Diego
    I had Apache Tomcat installed via apt-get; however, I decided to get a newer one, so I performed apt-get remove tomcat7 and apt-get purge tomcat7. I then installed a newer one, namely the Tomcat server bundled with the NetBeans install. However, I'm still seeing the old default page from the former Tomcat install: "It works! If you're seeing this page via a web browser, it means you've setup Tomcat successfully. Congratulations! This is the default Tomcat home page. It can be found on the local filesystem at: /var/lib/tomcat7/webapps/ROOT/index.html" I already set a different port in the server.xml file, and when I go to that site after executing the startup.sh file with sudo permissions, I don't get any page, as if the (new) server isn't running. How can I still be getting the page from the old Tomcat install? When I execute startup.sh, the log says everything is set up OK, so why isn't it working?

    Read the article

  • How to Share Files Online with Ubuntu One

    - by Chris Hoffman
    Ubuntu One, Ubuntu's built-in cloud file storage service, allows you to make files publicly available online or share them privately with others. You can share files over the Internet right from Ubuntu's file browser. Ubuntu One has two file-sharing methods: Publish, which makes a file publicly available on the web to anyone who knows its address, and Share, which shares a folder with other Ubuntu One users.

    Read the article
