Search Results

Search found 47506 results on 1901 pages for 'system sounds'.

Page 377 of 1901

  • Does using a hexacore CPU make sense?

    - by Exa
    I'm currently planning to upgrade my computer and want to replace the CPU, motherboard and RAM. I've already looked at some hexacore CPUs from AMD and would like to know whether it makes any sense to use a CPU with six cores. Is there any software that really uses six cores, especially in gaming? I use this PC mostly for gaming and from time to time for development. I know that on the dual-core system (2 x 3 GHz) I currently use, Visual Studio creates two instances of the compiler, one for each core. Would there be six compiler instances on a hexacore system for super-fast compiling? Would running two applications make use of more cores, for example two cores for a game you're playing while two other cores are used for compiling at the same time? I hope someone can point out the benefits of a hexacore system. The OS would be Windows 7 64-bit and I use the PC for gaming most of the time (Crysis 2, CoD, stuff like that).
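
    As a point of reference for the compile-parallelism part of the question, MSBuild-based builds can be told how many parallel workers to use from the command line, and the MSVC compiler has its own switch for compiling source files in parallel. A rough sketch (the solution and file names here are made up):

        rem Build projects in parallel, one MSBuild node per CPU core by default
        msbuild MySolution.sln /m

        rem Or pin the worker count explicitly, e.g. to six on a hexacore box
        msbuild MySolution.sln /maxcpucount:6

        rem For native C++ projects, /MP lets cl.exe compile source files in parallel
        cl /MP /c file1.cpp file2.cpp file3.cpp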

    Read the article

  • Sharing between Vista and Windows 7

    - by Metro Smurf
    Vista Ultimate 32-bit; Windows 7 Ultimate 64-bit. I've read through similar questions about sharing between Win7 and Vista, but none of them resolved my issue: "Connecting to a Vista shared folder from Windows 7", "Networking Windows 7 and Vista", "Enable File sharing in Windows Vista". Previously my Vista and XP systems shared back and forth without any problems, and I could access the shares without entering a user name / password in the NT challenge prompt (note: account names and passwords were different on the Vista and XP systems). I have since replaced the XP system with a Win7 system, and now, whenever I access shares to/from Vista / Win7, I am prompted with an NT challenge to enter my credentials. Things I've verified/tried:
      - Both systems are on the same workgroup.
      - Win7 is using the Home network and Vista the Private network; in other words, neither system is using a Public network profile.
      - Enabled file sharing with and without password protection on both Vista and Win7.
      - Tried the HomeGroup connections setting (Win7) with both "Allow Windows to manage connections" and "Use user accounts to connect".
      - Reviewed too many online articles to count while troubleshooting.
      - Set the shares to allow full control by Everyone.
      - Set up the shares both directly on the directory and through the share manager.
    My question: how can I enable file sharing between Vista and Win7 without ever being prompted with a username/password challenge?

    Read the article

  • My yum repository can search packages but not install them on RHEL?

    - by mandy
    I set up yum from the installation DVD. The contents of my .repo file are:

        [dvd]
        name=Red Hat Enterprise Linux Installation DVD
        baseurl=file:///media/dvd
        enabled=0

    I'm able to search packages; however, during installation I get the error below:

        [root@localhost dvd]# yum install libstdc++.x86_64
        Loaded plugins: rhnplugin, security
        This system is not registered with RHN. RHN support will be disabled.
        Setting up Install Process
        Nothing to do

    My yum search output:

        [root@localhost dvd]# yum search gcc
        Loaded plugins: rhnplugin, security
        This system is not registered with RHN. RHN support will be disabled.
        ==================== Matched: gcc ====================
        compat-libgcc-296.i386 : Compatibility 2.96-RH libgcc library
        compat-libstdc++-296.i386 : Compatibility 2.96-RH standard C++ libraries
        compat-libstdc++-33.i386 : Compatibility standard C++ libraries
        compat-libstdc++-33.x86_64 : Compatibility standard C++ libraries
        cpp.x86_64 : The C Preprocessor.
        libgcc.i386 : GCC version 4.1 shared support library
        libgcc.x86_64 : GCC version 4.1 shared support library
        libgcj.i386 : Java runtime library for gcc
        libgcj.x86_64 : Java runtime library for gcc
        libstdc++.i386 : GNU Standard C++ Library
        libstdc++.x86_64 : GNU Standard C++ Library
        libtermcap.i386 : A basic system library for accessing the termcap database.
        libtermcap.x86_64 : A basic system library for accessing the termcap database.

    Please guide me on this; I want to install gcc on my RHEL system.
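
    For comparison, a local-DVD repository definition is usually left enabled so that yum consults it during installs as well as searches. A rough sketch of such a .repo file (the mount point and gpgkey path are assumptions and may differ on your system):

        [dvd]
        name=Red Hat Enterprise Linux Installation DVD
        baseurl=file:///media/dvd
        enabled=1
        gpgcheck=1
        gpgkey=file:///media/dvd/RPM-GPG-KEY-redhat-release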

    Read the article

  • Openfire: Granular alerts

    - by R.S.
    Our organization has had an Openfire server up and running for about a year now. So far we have used it for messaging within the I.T. department and for alerts to all users. We hit a snag this week when one system went down and several notifications were sent out to inform users of progress; some of the recipients were radiologists who do not use the particular system in question, and they found the messages more of an annoyance than informative. Since then I have been tasked with finding a more granular system for alerts. I am confident that Openfire can handle this and I have just about settled on a way to get it working. My idea is to create half a dozen or so users, for example: Staff, Doctor, Assistant and Supervisor. Using Spark as our messenger has worked great so far, so I would like to stick with that if possible. With that in mind, under Spark's advanced login features the resource name can be changed to something unique, so several people can log in under the same account; however, when a message is sent to one of these accounts, delivery is inconsistent. Currently I have 4 people logged in under the Assistant user and it seems only 1 of them receives the messages. Is this scenario even possible? I am avoiding Openfire's groups because that feature is atrocious. I could possibly integrate the system with our Active Directory, but I don't think that would get us to a workable solution any quicker or more efficiently.

    Read the article

  • Server 2008R2 Server Manager Roles and Features won't refresh or allow addition of new roles or features

    - by MattChorba
    I have a standalone DC in an isolated lab. I have installed the System Update Readiness (SUR) tool, which found no errors, and I ran SFC, which also found no errors. I have attempted to install the Windows Server Backup feature using PowerShell, but received the same error about the computer needing to be restarted. The PowerShell cmdlets will list all of the installed roles and features, and the rest of Server Manager works without problems. What can I do to get Server Manager's Roles and Features working properly again? Picture of the error: (screenshot not reproduced here). CheckSUR.log:

        =================================
        Checking System Update Readiness.
        Binary Version 6.1.7601.21645
        Package Version 13.0
        2011-11-28 13:20

        Checking Windows Servicing Packages
        Checking Package Manifests and Catalogs
        Checking Package Watchlist
        Checking Component Watchlist
        Checking Packages
        Checking Component Store

        Summary:
        Seconds executed: 413
        No errors detected
        (w) Unable to get system disk properties 0x0000045D IOCTL_STORAGE_QUERY_PROPERTY Disk Cache

    CheckSUR.persist.log contains exactly the same content as CheckSUR.log above.
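
    As an aside, the PowerShell route mentioned above normally goes through the ServerManager module; a rough sketch (the exact feature name for Windows Server Backup may differ, so it is worth confirming it with Get-WindowsFeature first):

        # Load the Server Manager cmdlets (needed on 2008 R2)
        Import-Module ServerManager

        # List backup-related features and their install state
        Get-WindowsFeature *backup*

        # Install Windows Server Backup plus its command-line tools
        Add-WindowsFeature Backup-Features -IncludeAllSubFeature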

    Read the article

  • Troubles Installing Windows 7 via USB. Flat install?

    - by Brian
    Hi friends, I've been struggling with this for a while. Windows 7 64-bit Enterprise edition just will not install on my Shuttle K45 system via a USB key: it hangs during the install while copying files or while creating the partitions. The system is pretty standard and low-tech: IDE hard drives, no CD, 2 GB RAM. I am not sure what the problem is. Other than the Shuttle, I have an Apple MacBook Pro; on the MBP I run OS X, plus Mint Linux and Windows XP under Parallels. I have an ISO of Win7 that works (I installed it as a Parallels VM to make sure). I have used UltraISO and the MS Windows 7 USB/DVD Download Tool to write it to the 8 GB USB key; both seem to copy all the files correctly (with UltraISO I asked it to verify). It boots via USB and the install looks just fine until it hangs, most of the time with copy error 0x80070241. So now I am trying to figure out whether there are other ways to install Windows 7 on this Shuttle system that has no CD drive. I've heard about a flat installation, however those all seem to be done from within Windows. I do have access to a command prompt from the Windows 7 installer. Does anyone know if/how I can prep the Shuttle hard drive with the Windows 7 installation files and have Windows 7 install from the hard disk? I do not have an external enclosure for the IDE HD, and I do not have any other system I can hook the hard drives up to. I do have an external Maxtor OneTouch drive available.
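
    For what it's worth, a flat install can usually be prepared from that same installer command prompt by staging the installation files on a spare partition of the internal drive and making it bootable. A rough sketch, where all drive letters and the partition size are assumptions:

        rem In diskpart: carve out and activate an NTFS partition to hold the install files
        diskpart
          select disk 0
          create partition primary size=8000
          format fs=ntfs quick
          active
          assign letter=I
          exit

        rem Copy the contents of the USB key (assumed E:) onto the new partition
        xcopy E:\*.* I:\ /e /h /k

        rem Write the Windows 7 boot code so the partition boots into setup
        I:\boot\bootsect.exe /nt60 I: /mbr

        rem Remove the USB key and reboot; setup then runs from the internal partition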

    Read the article

  • What are good systems for managing PHP/MySQL infrastructure?

    - by sbrattla
    I work in a company which is about to migrate most applications from in-house custom-built Java/Tomcat applications to Drupal. Due to company policy, applications and websites need to run on in-house servers, which means we need infrastructure for Drupal (PHP/MySQL) applications. This must have been solved a million times already; I believe it is what web-hosting companies do every day. Even though we work on a much smaller scale than web-hosting companies, I assume it would make sense to look at the task as if we were going to run an internal small-scale web-hosting company: the guys in IT operations would be "responsible" for "offering" web hosting, while developers would use these "services". We have three environments: dev(elopment), test and prod(uction). It would make sense for developers to be able to log in to a system and create/edit/delete dev and test sites as they'd like. Production sites should be available through the same system, but only to IT ops. We need to work with clusters of web servers, meaning that an administration system should be capable of creating/editing/deleting sites across multiple servers. I know there's no "this is it" answer to my question, but what would be a good place to start? Apart from the actual hardware, what would be a good administration system for this?

    Read the article

  • No audio with streaming video

    - by Chris Barnhill
    I am having trouble with audio when playing streaming videos. My sound card is fine; I know this because if I play sounds from my local machine there's no problem. It's only when I try to play sounds from the internet that I lose audio. This only started happening recently, after I did two things: I connected a USB headphone/microphone set to record screencasts, and I began recording/publishing screencasts on screenr.com. I have tried playing video both with the headset connected and without it; it makes no difference. If I record a screencast on screenr.com and preview it, I hear the audio, but once I publish it and play it, there is no audio. I also hear no audio with YouTube videos. I really hope someone can help. Thanks. The latest is that the problem went away after I powered my system off and on; a reboot didn't do it, I had to actually shut down the power.

    Read the article

  • Computer freezing while watching Flash videos from net

    - by t3st
    I have Windows 7 Home Basic. While watching videos from the net, the computer starts to freeze within 5 minutes and shows 100 percent CPU usage. I first thought it was a browser issue, but watching videos in different browsers has the same problem. My system runs the latest Firefox and all my plugins (including Flash) are up to date. After this happens, when I shut down or restart the computer it will get to the login window without any problem, but as soon as I try to log in to any account the system starts to freeze again, and I have to restart and run Windows in safe mode (which doesn't show any problem). I read in an article to run sfc /scannow and chkdsk from the command prompt, and only after doing that does my system work normally; even then I can't watch any videos on the net or it starts freezing again (I can watch downloaded videos on the computer without a problem) and I have to repeat the whole process, which takes a lot of time. While running sfc /scannow it reported that some files are corrupted and could not be repaired. Can this be the cause of the freezing while playing Flash videos, or is it a hardware-related problem? What steps can I take to repair those corrupt files? System Restore only works sometimes.
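
    As a side note, the files that sfc flags as unrepairable are normally recorded in its servicing log, which can be extracted for inspection; a small sketch using standard Windows paths:

        rem Extract the System File Checker entries from the servicing log
        findstr /c:"[SR]" %windir%\Logs\CBS\CBS.log > "%userprofile%\Desktop\sfcdetails.txt"

        rem Open sfcdetails.txt to see which files could not be repaired
        notepad "%userprofile%\Desktop\sfcdetails.txt"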

    Read the article

  • Proper approach to debug PC startup problems (POST)

    - by saurabhj
    My CPU was heating up to around 65 °C, and the last time this happened (about a year ago) I had thermal paste put between the CPU and heat sink, which got it down to about 45-50 degrees. This time I got some thermal paste and applied it myself. However, my PC is now not showing the POST display and not starting up. This is what happens:
      - LEDs light up
      - HDDs spin
      - Mouse is getting power
      - All fans, including the processor fan, start
      - No display on the monitor
      - No diagnostic beeps (no sounds at all)
    What I have tried: removing everything including RAM, HDD, PCI cards and the AGP card, then booting the machine; nothing changed from the first state. What steps can I take to figure out where the problem lies? Note (might be important): when I removed the heat sink, the processor came out with it (it was stuck to the heat sink in spite of the processor latch being on) and I had to pry it off with a screwdriver. Configuration: Pentium 4, 2.8 GHz with HT (very old, I know); original Intel motherboard with onboard sound and graphics (GB series); 2 x 512 MB DDR RAM; 2 SATA disks (320 GB / 250 GB); DVD writer; Creative sound card; network card. Any help would be appreciated. Thanks!

    Read the article

  • What does Firefox do when "scanning for viruses" after download?

    - by Joey
    Never mind the fact that Firefox is a browser and not an AV tool, but what exactly does it do after a download? Even on systems that have an up-to-date AV this generates a pause of several seconds after the download (during which I can't open the file from within the download manager), and I have no idea what FF might be trying to do there. I know I can turn it off (I only use FF at work anyway), but I'm wondering. Some things I can think of that it might be:
      1. FF itself is an AV scanner and loads signatures in the background. Sounds highly unlikely, and it shouldn't need tens of seconds for 20 KiB files.
      2. FF talks to the installed AV to scan the file. Sounds unnecessary, given that most AV programs have real-time protection and would have caught a virus already, and FF does this on systems without an AV installed too.
      3. FF uploads the file to some online virus checker. Unlikely and stupid.
      4. FF instructs some online virus checker to download and check the file. Unlikely, and that service would be a nice target for DoSing.
      5. FF generates a hash of the file and sends it somewhere (presumably Google) to check, and gets back either "Whoa, that hash is totally a virus" or "Nope, that MD5 doesn't look very virus-y to me".
    I'm running out of better ideas. Anyone have a clue?

    Read the article

  • Broken Vista. Can't open Windows settings.

    - by serena
    My neighbor has a Lenovo laptop with Windows Vista Home Basic. She's a noob and just uses the laptop for the internet. She said she had to shut Windows down improperly (some time ago, maybe 6 months) because of a system freeze. She realized there was something wrong with her Windows when she tried to open the Windows Update settings. I took a look at the system and found the following errors:
      - When I click on Windows Updates, a bare white window opens for a second and closes immediately.
      - When I try to open Computer Properties, the same thing happens (Windows+Break doesn't work either).
      - When I try to open the Bluetooth settings, the same thing happens.
    So Vista won't let me open any Windows settings, but installed programs (games, applications, etc.) work correctly. She has no Windows Vista discs, since the laptop came with genuine Vista preinstalled, and no recovery discs either. I don't think there is a system restore point from when the system was stable. What can we do to solve this big problem?

    Read the article

  • How to determine if my AWS/EC2 server has been compromised / resolution?

    - by ElHaix
    I have recently seen an increase in network in/out activity on my server and am trying to determine whether my AWS/EC2 instance has been compromised, and if so, how to resolve it. In my security group I have:
      - Inbound: 80 (HTTP) 0.0.0.0/0
      - Outbound: 80 (HTTP) 0.0.0.0/0 and 443 (HTTPS) 0.0.0.0/0
    Using TCP-UDP Endpoint Viewer I see a lot of w3wp.exe TCP connections with varying local ports (http and numbered) as well as varying remote ports; some entries go red/yellow/green on updates. The remote address for most w3wp connections is my EC2 instance, but I am also seeing several to *.deploy.akamaitechnologies.com and *.deploy.static.akamaitechnologies.com with received bytes varying between 4-11 megs. I also see:
      - Ec2Config.exe, remote address: 169.254.169.254
      - A System process, remote address: fetcher4-4.p.mail.ru (how can I get rid of this one?!), local port: http, remote port: 33432
      - Some system processes from 114.216-244-93-rdns.wowrack.com, protocol TCP, local port http, varying remote ports
      - Some baiduspider "System Process" entries
    I'm afraid my system may have been compromised, and I am wondering whether these results are any indication of that. If so, how can I eliminate these possible threats? I have MS Security Essentials installed.
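
    For cross-checking what the endpoint viewer shows, the built-in Windows tools can list the same connections together with the owning process; a small sketch (the PID is a placeholder):

        rem List established TCP connections along with the owning process ID
        netstat -ano | findstr ESTABLISHED

        rem Map a suspicious PID (e.g. 1234) back to its executable and hosted services
        tasklist /fi "pid eq 1234" /svc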

    Read the article

  • HD working with IDE USB adapter but not recognised by BIOS

    - by Rajeeva
    I have a Windows XP Pentium III desktop with two hard drives. The first one has the OS and is, luckily, working. A few days ago the second drive, on the secondary master IDE channel, became unable to read some files; since then it was failing and reviving intermittently, and now it always shows as failed on the IDE channel. While the HD was failing intermittently I was able to copy some data from it to the other drive; during that time, if the disk failed while the system was running, the system froze and I had to reboot. I then got a new 80 GB HDD similar to the failing one (same make, a Seagate Barracuda), a new data cable for the drive, and an IDE-to-USB adapter. I installed the new hard drive in the previous drive's place (secondary master) and formatted it; it worked for one day and then it also failed. Meanwhile I connected the old HD via the IDE/USB adapter and could view all its data, and I managed to back some of it up from the old HD to the new HD before the new HD failed. I have tried connecting the new HD on the primary channel as the slave disk, but when I do that the BIOS does not detect either the OS drive or the new drive, and the system does not boot. Surprisingly, both the older (previously failed) HD and the new HD work fine on the USB channel with the IDE/USB adapter. I have ruled out a problem with the secondary channel, since the DVD-ROM I was previously using as primary slave is now connected as secondary master and works fine. I am really confused by this behaviour; can anybody please help me solve it. Thanks.

    Read the article

  • Managed LAMP platform for maximizing availability and global reach, not scalability

    - by user66819
    Assume a Linux/Apache/MySQL/PHP application for a small base of registered users. With a small user base there are no traffic peaks, so the scalability that cloud platforms offer is not imperative; but the system is mission-critical, so availability is the primary goal. Users are also distributed across Asia, Europe, and the US, so multiple server locations that minimize users' network hops would be highly desirable. The dream: a managed VPS platform where we would configure a single server (uploading PHP and other files, manipulating the database, etc.), and the platform would automatically mirror that server in a handful of key places around the world (say one on each US coast, one in Europe, one in east Asia). File system synchronization and MySQL replication would happen automatically. The core operating system would be managed, so we wouldn't need to do full system administration and security, and low-level backups would also be done by the service provider, though we would do our own backups as well. Couple this with some sort of DNS geo-detection, so users are routed to the nearest operational server... with support for HTTPS, of course. Does such a dream exist? If not, what are some approaches to accomplish the same end with minimal time investment and minimal monthly hosting costs?

    Read the article

  • REMOTE_USER not getting set?

    - by landed
    I am trying to set up LDAP authentication in Joomla using a plugin called JMapMyLDAP (in fact 4 plugins, each doing a different job). I need to pull part of a string out of the server variable REMOTE_USER, and this should be visible in phpinfo() (see http://timplummer.com.au/4-how-to-integrate-joomla-3-with-active-directory-using-ldap.html). The issue is that REMOTE_USER is not set, or at least not appearing. A few things to note (if you don't mind): conceptually I am not really understanding authentication as a whole subject; it appears to be vast despite my years working on websites. Yes, I have used ASP and built PHP pages to check that a user is who they say they are, with a token (/session?) given to just them so they can be identified when a stateless request is made to the server. That's my level of understanding. This sounds different to basic authentication in Apache, where a username and password sit in a file and the user has to log in through a basic prompt to get access to the folder/docs, configured via an .htaccess file. OK, so for the LDAP to work I need to get REMOTE_USER; this sounds very reasonable, as how else do we know who is making the request. Thank you.
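
    For reference, the classic Apache basic-auth arrangement described above is the mechanism that populates REMOTE_USER once a user has authenticated; a minimal .htaccess sketch (file paths are assumptions):

        # Protect this directory with HTTP Basic authentication
        AuthType Basic
        AuthName "Restricted area"
        # Password file created beforehand with: htpasswd -c /etc/apache2/.htpasswd someuser
        AuthUserFile /etc/apache2/.htpasswd
        Require valid-user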

    Read the article

  • Ubuntu 13.10 - How to disable LVM and cryptsetup? cryptsetup: evms_activate is not available

    - by NeverEndingQueue
    I am trying to remove whole-drive encryption from my Ubuntu installation. I ran Ubuntu from a Live CD, mounted the crypt partition and copied it to another partition, /dev/sda3:

        sudo cryptsetup luksOpen /dev/sda5 crypt1
        sudo dd if=/dev/ubuntu-vg/root of=/dev/sda3 bs=1M

    After that I ran boot-repair (https://help.ubuntu.com/community/Boot-Repair) and added an entry to /etc/fstab:

        UUID=<uuid> / ext4 errors=remount-ro 0 1

    Of course I replaced <uuid> with the blkid result for my /dev/sda3. I also deleted the overlayfs and tmpfs lines from /etc/fstab (I compared it to the /etc/fstab of a non-encrypted Ubuntu installation and could not find overlayfs and tmpfs there). I chrooted from the Live CD into my system and rebuilt the initramfs (following http://blog.leenix.co.uk/2012/07/evmsactivate-is-not-available-on-boot.html), and I also removed cryptsetup using apt-get remove. Basically I can easily mount my system partition from the Live CD (without setting up the encryption and LVM stuff), but I can not boot from it. Instead I see:

        cryptsetup: evms_activate is not available

    When I chose Recovery mode I saw this:

        Begin: Mounting root file system ...
        Begin: Running /script/local-top ...
        Reading all physical volumes. This may take a while ...
        No volume groups found
        cryptsetup: evms_activate is not available
        Begin: Waiting for encrytpted source device ...

    My /etc/crypttab is empty. I am pretty sure the system still tries to find an encrypted partition, searches for LVM volumes, and so on. Do you have any ideas what the problem could be or how I can fix it? Thanks.
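
    In case it helps anyone reading along, the chroot-and-rebuild-initramfs step mentioned above is typically done roughly like this from the Live CD (device names are assumptions for this particular layout):

        # Mount the copied root filesystem and the pieces the chroot needs
        sudo mount /dev/sda3 /mnt
        sudo mount --bind /dev /mnt/dev
        sudo mount --bind /proc /mnt/proc
        sudo mount --bind /sys /mnt/sys

        # Enter the installed system and rebuild the initramfs for all kernels
        sudo chroot /mnt
        update-initramfs -u -k all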

    Read the article

  • What does this error mean in my IIS7 Failed Request Tracing report?

    - by Pure.Krome
    When I attempt to go to any page in my web application (I'm migrating the code from an ASP.NET web site to a web application, and am now testing it) I keep getting some "not authenticated" error(s). So I've turned on FREB (Failed Request Tracing) and this is what it says... I'm not sure what that means? Secondly, I've also made sure that my site (or at least the default document, which has been set up to be default.aspx) has anonymous authentication on and the rest off. Proof:

        C:\Windows\System32\inetsrv>appcmd list config "My Web App/default.aspx" -section:anonymousAuthentication
        <system.webServer>
          <security>
            <authentication>
              <anonymousAuthentication enabled="true" userName="IUSR" />
            </authentication>
          </security>
        </system.webServer>

        C:\Windows\System32\inetsrv>appcmd list config "My Web App" -section:anonymousAuthentication
        <system.webServer>
          <security>
            <authentication>
              <anonymousAuthentication enabled="true" userName="IUSR" />
            </authentication>
          </security>
        </system.webServer>

    Can someone please help?
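
    Purely as a cross-check, the same appcmd pattern used above can dump the other authentication sections as well, which makes it easier to see which module might be rejecting the request; a sketch (the site name is the one from the question):

        rem Check whether Windows and Basic authentication are also enabled for the site
        appcmd list config "My Web App" -section:windowsAuthentication
        appcmd list config "My Web App" -section:basicAuthentication

        rem And the ASP.NET authorization rules carried over from the web site project
        appcmd list config "My Web App" -section:system.web/authorization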

    Read the article

  • Provider claiming "all web servers in the cloud are automatically kept in sync" - should I be skeptical?

    - by RobMasters
    I'm no expert in cloud computing: I've spent a fair bit of time researching it and various providers, but have yet to get any hands-on experience with it. From what I've read about AWS and auto-scaling EC2 instances, though, it seems as though each instance should be completely decoupled from all other instances; i.e. if content is uploaded to a web server's local filesystem from a custom CMS backend, that content won't be available if it is subsequently requested from a different web server in the auto-scaling group. Is that right? I met with a representative of our existing hosting provider recently and he claimed it isn't a problem that our legacy CMS system is highly dependent on having a local filesystem. He said that all web servers, regardless of how many, would be kept as exact duplicates, so I shouldn't notice any difference compared to our existing setup of a single dedicated server. This smells a little too much like bull fecal-matter to me... should I be skeptical about this? I'm a little worried because my (non-technical) boss, who ultimately makes the decisions, is all for signing up to this cloud solution because it won't require any extra work. I'm sure they must at least be able to provide this, otherwise they wouldn't be attempting to sell it to us, but at what cost? It sounds as though each web server would always need to be checking the other web server(s) for new static content, which to me sounds like unwanted overhead that will slow things down. I'd really appreciate it if somebody could clear this up for me. I'm all for switching to AWS and using S3 + CloudFront for all static content, but that isn't looking very likely to happen at the moment.

    Read the article

  • Nvidia RAID 1 Problem. Degraded drives...

    - by Vedat Kursun
    I had a RAID 1 on my system, which has a Gigabyte GA 8N SLI motherboard with an Nvidia chipset (Nvidia RAID IDE ROM BIOS 4.84). When the system was working properly there used to be an icon in the system tray that showed my two RAID disks. But after my friend accidentally clicked on the "Remove drive safely" icon while trying to disconnect her USB drive, I noticed that the RAID was no longer working. After a reboot there was suddenly a failure message during the boot screen. When I enter the Nvidia RAID setup utility (F10) I can see that both drives are marked degraded, and that won't change even if I select them and press R for Rebuild; the only other options are Delete and Exit. When I boot into Windows (XP Pro 32-bit) I can see both disks with the same data on each of them, but my RAID 1 is broken. It's a relief that at least my RAID 1 was active, but it's annoying not being able to rebuild it. Is there a way I can rebuild my RAID 1 without having to delete the array and build it again? I don't want to back up 400 GB of data and then recopy it to my drives... (Disks: 2 x Seagate ST3500418AS SATA drives)

    Read the article

  • Server 2008 R2 boot is at 2 hours and counting. What now?

    - by Jesse
    This morning we rebooted our Server 2008 R2 box. No problem, it came right back up. Then we shut it down and let it install Windows updates, and while it was off we added some RAM. Then we turned it back on. The system came right back up to the "press Ctrl-Alt-Delete" screen; so far, so good. I logged in. The system got as far as "Applying Group Policy", then spent almost an hour applying drive mappings. It finally finished that and has now spent 30 minutes waiting for the Event Notification Service; I still haven't been able to log in, and the Remote Desktop service doesn't appear to be running yet. I tried viewing the event log from another machine. I can see that the box is writing to the Security log, but there are no events in System or Application in the last 45 minutes. Digging through the System log entries from 45 minutes ago, I see a bunch of timeouts:
      - A timeout (30000 milliseconds) was reached while waiting for a transaction response from the ShellHWDetection service. [lots of these]
      - A timeout (30000 milliseconds) was reached while waiting for a transaction response from the wuauserv service.
      - A timeout (30000 milliseconds) was reached while waiting for a transaction response from the SessionEnv service.
      - A timeout (30000 milliseconds) was reached while waiting for a transaction response from the Schedule service.
      - A timeout (30000 milliseconds) was reached while waiting for a transaction response from the CertPropSvc service.
    What can I do? Should I try shutting it down remotely, or will that do more damage?

    Read the article

  • Basics about file/folder permissions on Win 7

    - by Altar
    Hi. Under Win XP I never touched the permissions of a file or folder; I was happy with the way it worked. But recently I installed Windows 7 on a drive that previously hosted Windows XP. Now some programs do not have 'read' and/or 'write' access to their own folders, and I am not talking about system folders like 'Program Files' but normal folders like 'C:\my data\my own folder\program folder'. I see that folders created under Win XP have some user groups that do not exist on 'normal' folders (folders created by me recently under Windows 7). For example, for a Win XP folder I have:
      - Creator Owner
      - System
      - Account Unknown (S-1-5-21 blablabla...)
      - Admins
      - Users
    For Win7 folders I have:
      - Authenticated Users
      - System
      - Admins
      - Users
    How should I proceed? Should I give the "Users" account the right to write to the XP folders? Should I make the old (XP) folders have the same groups of users as the normal (Win7) ones by adding the "Authenticated Users" account to them? Should I delete the "Account Unknown" account from my system? (In this case, how?) Many thanks.
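
    For whoever digs into this, the effective ACLs on both kinds of folder can be compared from the command line before changing anything; a small sketch (the second path and the SID are placeholders, the first path is the one from the question):

        rem Show the access control entries on an XP-era folder and a Win7-era folder
        icacls "C:\my data\my own folder\program folder"
        icacls "C:\my data\some win7 folder"

        rem Orphaned SIDs (the "Account Unknown (S-1-5-21...)" entries) can be removed per folder tree
        icacls "C:\my data\my own folder\program folder" /remove *S-1-5-21-XXXXXXXXX-XXXX /t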

    Read the article

  • How to configure a Web.Config file to allow custom 404 handling while still displaying on-page 500 error detail?

    - by Mark
    To customize 404 handling, and based on the hosting company's suggestion, we are currently using the following web.config setup. However, we quickly realized that with this configuration any page error (500 error) also gets redirected to the custom error page. How can I modify this config file so we can continue to handle 404s with the custom file while still being able to view on-page error detail?

        <?xml version="1.0" encoding="utf-8" ?>
        <configuration>
          <system.webServer>
            <httpErrors errorMode="DetailedLocalOnly" defaultPath="/Custom404.html" defaultResponseMode="ExecuteURL">
              <remove statusCode="404" subStatusCode="-1" />
              <error statusCode="404" prefixLanguageFilePath="" path="/Custom404.html" responseMode="ExecuteURL" />
            </httpErrors>
          </system.webServer>
          <system.web>
            <customErrors mode="On">
              <error statusCode="404" redirect="/Custom404.html" />
            </customErrors>
          </system.web>
        </configuration>
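
    One direction to experiment with, offered as an untested sketch rather than a known fix, is to keep the httpErrors block scoped to 404 only (so other status codes keep their detailed local error pages) and switch customErrors to RemoteOnly so ASP.NET error detail remains visible when browsing from the server itself:

        <configuration>
          <system.webServer>
            <!-- Only 404 is redefined here; other status codes keep their default handling -->
            <httpErrors errorMode="DetailedLocalOnly">
              <remove statusCode="404" subStatusCode="-1" />
              <error statusCode="404" path="/Custom404.html" responseMode="ExecuteURL" />
            </httpErrors>
          </system.webServer>
          <system.web>
            <!-- RemoteOnly keeps full ASP.NET error detail when browsing on the server itself -->
            <customErrors mode="RemoteOnly">
              <error statusCode="404" redirect="/Custom404.html" />
            </customErrors>
          </system.web>
        </configuration>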

    Read the article

  • Computer slow after installing 32GB RAM

    - by John Gilmore
    I'm currently running very large network simulations for my PhD research, for which I need lots of RAM. I have a Core i7 2600K processor on a Gigabyte GA-Z68AP-D3 motherboard, running Windows 7 Professional 64-bit. I bought the system with 8 GB (2 x 4 GB) of DDR3 1600 MHz Corsair Vengeance RAM and it ran like a dream. I'm planning to upscale my simulations, so I removed the 2 x 4 GB RAM and installed 4 x 8 GB of DDR3 1600 MHz Corsair Vengeance RAM. When I rebooted, boot time was much longer than usual (10 minutes just to get to the login screen), and after logging in the whole system was unresponsive. I tried playing some games (Bioshock 2), but it was unplayable. I've not had this problem before, and I have an ATI Radeon HD 5850 graphics card, so that's not the problem. The only thing that's changed is the RAM. I've looked through the specifications of Windows, my motherboard and my CPU, and they all state that 32 GB of RAM is supported. Does anyone have an idea of what's going on? Any help would be greatly appreciated.

    Read the article
