Search Results

Search found 11991 results on 480 pages for 'cedric copy'.

  • Computer not finding hard drives on boot -sometimes-

    - by todd.pund
    Computer specs: Gigabyte UltraDurable 3 GA-970A-UD3 motherboard, first-gen i7 at 3.2 GHz, 8 GB Kingston DDR3 1066 RAM, EVGA NVIDIA GTX 460 1 GB video card, and two 500 GB 7200 rpm hard drives (can't remember the brand, sorry, I'm at work).

    Last week my Developer Preview of Windows 8 expired, so I put my copy of Windows 7 back on the computer. From that point the machine started freezing and crashing frequently. On reboot it sometimes wouldn't find the system hard drive at all; looking at the POST screen, it seemed to find neither of the drives. Then yesterday, on powering up, I just got "GRUB" as a message (not a GRUB prompt, just "GRUB"), even though I haven't had a dual boot of Linux for at least a year. I loaded the Windows 7 recovery console from the disc and ran:

        bootrec /fixboot
        bootrec /fixmbr

    That did not help, so at that point I installed Ubuntu 13.04 over the Windows 7 install and still got the GRUB message at POST. I went into the BIOS, switched the hard drive priorities, and then it booted into Ubuntu fine.

    For several days everything was hunky-dory, until I installed the Ubuntu version of Steam, installed Portal, and tried to run it. The computer froze, and after a hard reboot it couldn't find the hard disks again. After restarting once more it loaded up fine, and there have been no issues since (I have not tried to launch Portal again).

    My next thought is to remove the system hard drive and try using the secondary as the master, to see whether the primary drive is bad. I'm sorry if this is confusing; I'll answer any questions I can. Any thoughts?
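
    Before pulling drives, it may be worth asking the disks themselves what state they're in from the Ubuntu side. A quick sketch using smartmontools (package and device names are assumptions; substitute your actual devices):

        sudo apt-get install smartmontools
        # Overall SMART health verdict and the drive's logged errors:
        sudo smartctl -H /dev/sda
        sudo smartctl -l error /dev/sda
        # Run a short self-test now; read the result later with -l selftest:
        sudo smartctl -t short /dev/sda

    A failing overall-health result or a growing error log would point at the drive rather than at the motherboard's SATA side.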

  • How to export and import a user profile from one Quassel core to another?

    - by Zertrin
    I have been using Quassel as my IRC bouncer for quite a long time now. We (a group of administrators of a small network) have set up a shared Quassel core with many users on it. I would now like to export everything related to my user account from the database on this core, in order to re-import it later into another Quassel core on my own server.

    Unfortunately, while a feature for adding users has been implemented in Quassel, nothing is provided so far for either exporting or deleting a user. (If a delete-user feature were available, I could have made a copy of the current database, deleted all the other users leaving only mine, and used the resulting database on my own server, leaving the original untouched on the shared server.)

    Despite extensive research on the Internet, I've found no solution so far. I should point out that the core's backend database was migrated from the default SQLite backend to a PostgreSQL backend as the database grew considerably (over 1.5 GB by now). However, I'd be glad to hear of any working solution (SQLite or PostgreSQL backend) describing a way to export the data related to a specific user profile and then re-import it into a new Quassel core database.
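
    Lacking a built-in export, one hedged approach on the PostgreSQL side is the "delete the other users" route done with SQL on a copy of the database, rather than waiting for a Quassel feature. The table and column names below are illustrative assumptions; inspect the real schema first:

        # Clone the core's database, then prune the clone (names are placeholders).
        createdb -T quassel quassel_export
        psql quassel_export -c '\dt'    # check the actual table names
        # Then, per table, delete rows belonging to other users, for example:
        #   DELETE FROM backlog WHERE bufferid IN
        #     (SELECT bufferid FROM buffer WHERE userid <> 42);

    The pruned clone could then be dumped with pg_dump and restored on the new server, leaving the shared core untouched.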

  • Would a PHP application benefit from being served from a RAM drive?

    - by Tom Marthenal
    I am in charge of hosting a PHP application that is large and slow, but easy to scale. The application is entirely static, with no writable disk storage needed. We've profiled the application, and the main bottleneck appears to come from loading the application rather than from the work the application does. It is not CPU-intensive, although it does use a fair amount of memory (think Magento).

    Currently we distribute it by having a series of servers with the same PHP files on their hard drives and a load balancer in front of them. Easy, but expensive. I've been reading about RAM disks and the I/O benefits they offer, and was wondering if they would be well suited to PHP applications. Since PHP applications are loaded from disk on every request and often involve lots of different files (as opposed to being kept in memory, as with a Java application), I would figure that disk performance can be a severe bottleneck.

    Would placing the PHP files on a RAM disk and using the mount point as Apache's document root offer performance benefits? A startup script could create the RAM drive and then copy the files (which are plain text and small) from a permanent location to the temporary RAM drive. Does this make sense, or should I just trust the Linux kernel to cache the appropriate files in memory by itself?
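
    For what it's worth, a minimal sketch of such a startup script, assuming the persistent copy lives in /var/www-persist and Apache's document root is /mnt/ramdisk (both paths are placeholders):

        #!/bin/bash
        # Create a RAM-backed filesystem and copy the application into it.
        mkdir -p /mnt/ramdisk
        mount -t tmpfs -o size=512m tmpfs /mnt/ramdisk
        cp -a /var/www-persist/. /mnt/ramdisk/
        # Apache's DocumentRoot should already point at /mnt/ramdisk.
        apachectl graceful

    Note that tmpfs contents vanish on reboot, which is fine here since the application is read-only and re-copied at startup.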

  • Notify user of message arrival in another mailbox

    - by Tim Alexander
    This is very similar to this question but has a few differences. Basically we have a user dealing with a conflict-of-interest case. To separate the mail from prying eyes (and the draconian routing system we have in place), the user has been granted access to a second conflicts mailbox that is accessible to him only via OWA. This has worked fine for years, but now the user would like a notification sent to him when a message arrives in his conflicts mailbox.

    Initially I thought an Outlook rule would work, but of course the client is never logged in, so Outlook rules are never processed. That led me to think an Exchange transport rule might work, but the only options I can see are to forward or copy the message to another user, which would bypass the conflicts setup. All I really need is a notification, not the actual message, to be sent.

    Is this at all possible with Exchange 2007? Or if not, is there any third-party add-on or workaround that anyone has come across?

  • Have a set of cgi scripts shared by multiple domains

    - by rpat
    Goal: have multiple domains share a set of cgi (Perl) scripts.

    Environment: Apache 2.0 on a dedicated CentOS server (Apache configuration files generated by cPanel).

    I have dozens of domains on the dedicated server, set up by cPanel under VirtualHost sections. I have almost no knowledge of Apache; most of what I do is taken care of by cPanel.

    I would like to put a set of scripts under one directory (perhaps under / or /opt) and, in each domain's individual cgi-bin, create a symbolic link to this common directory. This way I hope to avoid keeping a copy of the scripts for every domain. Since the Apache config files are generated by cPanel, I would rather not make manual changes to them; besides, I could mess things up. I see that cPanel recommends using include files rather than changing httpd.conf.

    Perhaps I need following of symbolic links enabled in the cgi-bin directories, and to allow the web server user to execute scripts not owned by it. Maybe I am making this more complicated than it is; I would be glad to use any other means to achieve my goal. Thanks in advance for your help.

    (I asked this on Stack Overflow and someone suggested I ask it on Server Fault.)
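
    A minimal sketch of the symlink layout, assuming the shared scripts live in /opt/shared-cgi and the usual cPanel home-directory paths (all paths are assumptions):

        # One canonical copy of the scripts:
        mkdir -p /opt/shared-cgi
        cp /path/to/scripts/*.pl /opt/shared-cgi/

        # Each domain's cgi-bin gets a symlink instead of a copy:
        ln -s /opt/shared-cgi /home/user1/public_html/cgi-bin/shared
        ln -s /opt/shared-cgi /home/user2/public_html/cgi-bin/shared

    Apache would also need Options +FollowSymLinks +ExecCGI for that directory via a cPanel include file, and note that suEXEC (which cPanel uses for CGI) normally refuses to run scripts not owned by the account's user, which may be the real obstacle here.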

  • Shortcuts located in "D:\Program Data\..." not working even though they're pointing to the right targets (Windows 7)

    - by Kevin
    I just made a fresh install of Windows 7 Home Premium using my laptop's recovery disks (HP Pavilion dv6-2151cl) with minimal settings. After the install, I moved "Program Data" and "Users" to the D: partition to save space, changing the folder locations in the registry. Then I updated Windows (including Windows 7 SP1) and installed all my other programs.

    After installing everything else, I noticed that in "All Programs" the icons of all newly installed programs (those not included in the Windows install) show a blank sheet, and the shortcuts don't do anything. Looking in "D:\Program Data\Microsoft\Windows\Start Menu\Programs" in Windows Explorer, the same is true there. All the shortcuts under C: and "D:\Users\..." work, both in Windows Explorer and in "All Programs". I also noticed that the shortcuts do display the right icons inside "Open" dialog boxes, and if I copy the shortcuts from "D:\Program Data\..." to the desktop, they work as expected.

    I checked the file association for .lnk and it was OK, but I also tried the registry fixers for this association and they had no effect. As far as I can tell no programs are missing from the "All Programs" menu; their shortcuts just don't do anything as long as they live under "D:\Program Data\...".

    Any thoughts on how to make Windows 7 treat shortcuts in "D:\Program Data\..." as it should?

  • Office documents on intranet all requiring a second login and failing auth? Disable WebDAV?

    - by DOTang
    I am not sure what is going on, but recently all the Office documents on our intranet prompt for a second login, and according to the error logs it looks like the client is trying to use WebDAV to open (an editable?) version of the document directly from the server. We have no SharePoint server set up or anything like it, so this shouldn't be happening. All I want is for the document to be saved, or opened from a local copy in temp, like normal. Here is the log:

        Line 57499: 2011-04-12 15:57:10 (ip) OPTIONS (address) - 443 (username) (user ip) Microsoft-WebDAV-MiniRedir/6.1.7601 - 401 1 1326 1525 238 0
        Line 57500: 2011-04-12 15:57:10 (ip) OPTIONS (address) - 443 (username) (user ip) Microsoft-WebDAV-MiniRedir/6.1.7601 - 401 1 1326 1525 238 0
        Line 57501: 2011-04-12 15:57:10 (ip) OPTIONS (address) - 443 (username) (user ip) Microsoft-WebDAV-MiniRedir/6.1.7601 - 401 1 1326 1525 238 0

    The log basically contains a bunch of these. How can I disable this behavior, so that downloaded Office documents aren't opened through WebDAV?

    Edit: I should clarify the behavior. It asks if you want to save or open the document; upon choosing Open, it asks to re-authenticate. You put in the user information, and the login box comes up three times, acting as if you entered the wrong password. For some users, after passing the login box the third time the document still opens; for others the browser just locks up. It also doesn't look like WebDAV is even installed on our server; I see no config options for it in IIS, as outlined on this page: http://learn.iis.net/page.aspx/350/installing-and-configuring-webdav-on-iis-7/#001
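
    If the goal is just to stop Office and Windows from probing the server over WebDAV, one common client-side workaround is to disable the WebDAV redirector (the WebClient service) on the affected machines. A sketch, run from an elevated prompt (this changes the client, not the server, and removes all WebDAV access from that machine):

        sc config WebClient start= disabled
        net stop WebClient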

  • Solaris 10: How to remove devices from a zpool with /usr currently mounted?

    - by cali-spc
    I use Solaris 10 on SPARC. I have /usr legacy-mounted on a zpool, 'usr-pool'. I now need to move some of the devices in usr-pool to another zpool that is running out of room. What is the safest way for me to do this?

    I already know that (since my zpool is not mirrored) I need to destroy and recreate it, and I know how to back up and restore a ZFS snapshot. However, I'm stumped on how to unmount usr-pool without losing access to the commands on /usr that I need to complete the backup/restore. Cursory research indicated that I should boot to OpenBoot (init 0) and then 'boot cdrom -s'. I did this, but none of the zpools are accessible at that runlevel. I also read that I could just copy /usr to another location, symlink /usr to that location, then do my backup/restore. Is that safe to do?

    I would appreciate some guidance. S.
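
    For the copy itself, zfs send/receive avoids any dependence on /usr staying mounted once you're in the miniroot; and the reason the pools were invisible after 'boot cdrom -s' is that pools aren't auto-imported there. A sketch (pool and dataset names are placeholders):

        # From the cdrom miniroot, import the pools under an alternate root:
        zpool import -R /a usr-pool
        zpool import -R /a newpool

        # Snapshot the /usr dataset and replicate it to the other pool:
        zfs snapshot usr-pool/usr@migrate
        zfs send usr-pool/usr@migrate | zfs receive newpool/usr

    The miniroot's own /usr supplies the commands, so nothing on usr-pool needs to remain mounted while it is destroyed and recreated.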

  • Moving files and directories between two machines, via a third, preserving permissions and usernames

    - by Jarmund
    The situation is as follows:

    - Machine A has a file repository accessible via rsync.
    - Machine B needs the above-mentioned files with all permissions and ownerships intact (including groups etc.).
    - Machine C has access to both A and B, but has a completely different set of users.

    Normally I would just rsync everything over directly between A and B, but due to severely limited bandwidth at the moment I need something different, as rsync times out after building the list of the 430 files (49 MB uncompressed; it can be compressed down to ~7 MB).

    What I've tried so far: rsync everything over from A to C, tar it, copy the tarball over, and then untar it. However, this messes up the ownership and/or the permissions. To rsync from A to C, I run this command:

        rsync --numeric-ids --password-file=/root/rsync_pwd_file -oaPvu rsync://[email protected]/portal_2/ ./portal_2/

    ...and from the looks of things, the files do end up on C with the correct ownerships/permissions/flags/everything (not 100% sure, though; are there any more switches I can throw in there? Did I miss something?). Copying the tarball over is simple enough (slow as a one-legged turtle due to the bandwidth, but it checksums out alright).

    What I'm unsure of are the flags and switches for creating and extracting the tarball, so could someone please provide the full commands for creating a tarball from /root/portal_2 on machine C (with everything intact) and extracting it into /var/ex/portal_2 on machine B?

    Also, are there any other approaches worth mentioning that would let me do this? I have root access to A and C, whereas I only have rsync access to B.

    PS: I'm running rsync v2.6.9 on machine B, and unfortunately I don't have the opportunity to upgrade to v3.
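
    A sketch of tar commands that keep numeric ownership and modes intact end-to-end, pairing with the --numeric-ids you're already giving rsync (paths are the ones from the question; GNU tar assumed):

        # On machine C, as root: pack /root/portal_2 with numeric uid/gid.
        cd /root
        tar --numeric-owner -cpzf portal_2.tar.gz portal_2

        # On machine B, as root: unpack into /var/ex, restoring owners as stored.
        cd /var/ex
        tar --numeric-owner --same-owner -xpzf /path/to/portal_2.tar.gz

    GNU tar restores ownership by default when run as root, so --same-owner is belt-and-braces; --numeric-owner on both ends stops tar from translating uids through machine C's unrelated user list.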

  • Windows Server 2008 IIS7 - page requests randomly hang or timeout

    - by seb835
    I've just installed a fresh copy of Windows Server 2008 x64 with IIS7 and PHP (FastCGI). However, after moving my web site from a similarly specified machine to this one, I'm noticing issues. Randomly, as I click through the web pages being served, the browser will suddenly hang saying "waiting for mysite.com...". Sometimes the page then times out, or finally resolves after 20 to 30 seconds, but maybe with the CSS style sheet missing.

    Very strange, as this is the only website installed on this new server, and only I am using/testing it. The server is installed in the same data centre, below the old server. Has anyone else had a similar problem? I tried increasing the max worker processes for the pool to 10 (from 1), but this had no effect. It seems to happen most frequently after 1 or 2 minutes of inactivity on the website, then trying to load/refresh a web page.

    Many thanks for any info and help. Kind regards, Seb

  • S3sync not working

    - by user57833
    Hello, I managed to get s3sync to upload my test folder to Amazon S3, and I can see it in the AWS Management Console. Downloading the data back to a test folder results in the following error message:

        root@mybucketname:/var/s3sync# ./week_download.sh
        s3Prefix backups/weekly
        localPrefix /var/s3sync/testdown/weekly
        s3TreeRecurse mybucketname backups/weekly
        Creating new connection
        Trying command list_bucket mybucketname prefix backups/weekly max-keys 200 delimiter / with 100 retries left
        Response code: 200
        prefix found: /
        s3TreeRecurse mybucketname backups/weekly /
        Trying command list_bucket mybucketname prefix backups/weekly/ max-keys 200 delimiter / with 100 retries left
        Response code: 200
        S3 item backups/weekly/
        s3 node object init. Name: Path:backups/weekly Size:0 Tag:d41d8cd98f00b204e9800998ecf8427e Date:Fri Oct 29 14:21:53 UTC 2010
        local node object init. Name: Path:/var/s3sync/testdown/weekly/ Size: Tag: Date:
        source:
        dest:
        Update node
        s3sync.rb:638:in `initialize': No such file or directory - /var/s3sync/testdown/weekly/.s3syncTemp (Errno::ENOENT)
            from s3sync.rb:638:in `open'
            from s3sync.rb:638:in `updateFrom'
            from s3sync.rb:393:in `main'
            from s3sync.rb:735

    I am using the following download script:

        #!/bin/bash
        # script to download from S3 to a local directory
        cd /var/s3sync/
        export AWS_ACCESS_KEY_ID=nothing to see here
        export AWS_SECRET_ACCESS_KEY=nothing to see here
        export SSL_CERT_DIR=/var/s3sync/certs
        ruby s3sync.rb -r -v -d --progress --make-dirs mybucket:backups/weekly /var/s3sync/testdown
        # copy and modify line above for each additional folder to be synced

    Any ideas? Does the download script need to download to the folder the data was uploaded from, i.e. the testup folder? I was hoping that in the case of a complete failure, where the original folders no longer exist, it would just download everything for me.

    Note: I changed my bucket names to "mybucketname" so that they are not public!
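
    The Errno::ENOENT names the local side: s3sync is trying to create its .s3syncTemp file inside /var/s3sync/testdown/weekly/, which doesn't exist yet. A plausible fix (a reading of the error, not s3sync documentation) is simply to pre-create the local destination tree before the first run:

        mkdir -p /var/s3sync/testdown/weekly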

  • Something like Dropbox for local use

    - by Casper
    I am looking for a solution to sync folder pairs between a NAS and multiple local Macs. Each of the Macs could edit files, and the other Macs should then get the changes synced automatically. Basically my own local version of Dropbox, without using cloud storage.

    I have looked into solutions using rsync. As I understand it, rsync is not really capable of doing a bi-directional sync. I also don't want to have to invoke the sync process manually; I would prefer a daemon running in the background, waiting and checking for changes and then syncing them "live". The program should also be flexible enough to recognize that it sometimes (in the case of laptops) cannot reach the NAS. It should then just wait for the connection to come back, without bugging me every few minutes.

    I have looked into Synk, folderwatch, rsync and a few others, but I haven't really found a solution. Isn't there something like Microsoft's "offline folders" for the Mac? Thanks.

    PS: Just for clarification, I don't want to sync for backup purposes; I want to sync so that all Macs have a local copy of the most recent changes to files.
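
    Unison is one tool that does genuine bidirectional sync between a Mac and a NAS over SSH, and a crude loop gets most of the way to a background daemon; it fails quietly and retries when the NAS is unreachable. A sketch (hostnames and paths are placeholders):

        #!/bin/bash
        # Two-way sync every 60 seconds; -batch/-auto resolve what they can
        # without prompting and skip genuine conflicts for manual review.
        while true; do
            unison -batch -auto ~/Shared ssh://nas.local//volume1/Shared
            sleep 60
        done

    On a Mac this could be wrapped in a LaunchAgent so it starts at login and stays running.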

  • Seagate 3TB hard drive loses format information

    - by Victor Bugarin
    I have Windows 7 x64 Ultimate, 6 GB of memory, and a 1 TB system drive, plus a 3 TB Barracuda XT. The Barracuda is installed in a StarTech four-bay external enclosure. I had trouble with it, so I converted it to GPT, created one partition, and formatted it as NTFS.

    I can write to and read from the hard drive, but it becomes unreadable at some point while I am copying files, or after I have copied files to it. I have copied large Blu-ray movies and various video files, 32 GB of pictures, and about 86 thousand music files in different formats. At some point the partition becomes unreadable, and I have to format it again (all files lost) and start the whole process over. At one point I was unable to copy large ISO (Blu-ray movie) images at all.

    I have also partitioned the drive into two partitions (P1: 2 TB, P2: 1 TB) and lost every file on both partitions the same way. I reformat the drive and it seems fine. I have run SeaTools to check the drive and it reports OK. What gives?

  • Never getting a JSON response from my server-side PHP proxy script, though I do with others

    - by Dohk
    I'm on PHP 5.3.4 and Apache 2.2, btw.

    So I'm using (or trying to use) Simple PHP Proxy. I enter a URL at the example page on the author's site and it works fine; I see the JSON response and all the headers. However, when I copy the exact same URL, changing it only to point at localhost, I get both empty headers and no JSON. Assuming the script on his site is the same one I downloaded, could this be due to any of a multitude of things, or a setting in Apache and/or php.ini?

    For example, this returns a ton of info:

        benalman.com/code/projects/php-simple-proxy/ba-simple-proxy.php?url=http://github.com/&full_headers=1&full_status=1

    Now, changing to localhost:

        http://localhost/ba-simple-proxy.php?url=http://github.com/&full_headers=1&full_status=1

        {"headers":[],"status":{"url":"https:\/\/github.com\/","content_type":"text\/html","http_code":301,"header_size":194,"request_size":182,"filetime":-1,"ssl_verify_result":0,"redirect_count":1,"total_time":0.094,"namelookup_time":0,"connect_time":0.047,"pretransfer_time":0,"size_upload":0,"size_download":185,"speed_download":1968,"speed_upload":0,"download_content_length":185,"upload_content_length":0,"starttransfer_time":0,"redirect_time":0.047,"certinfo":[]},"contents":null}

    I even went basic and just used some curl, and of course got empty objects back, other than false for my contents and the URL I set in my JSON. Any help is deeply appreciated, as are any ideas.
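
    The status block in that JSON is suggestive: http_code is 301 with contents null, so the local proxy followed github.com's redirect to https and came back empty, which often means cURL on that box is failing SSL verification (a guess from the output, not a confirmed diagnosis). A quick way to test cURL outside the proxy script:

        # Can this machine fetch the https target at all?
        curl -v -L "https://github.com/" -o /dev/null

    If that fails with a certificate error, the script's curl handle needs a CA bundle (CURLOPT_CAINFO pointed at a cacert.pem), or the request has to stay on plain http.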

  • Windows Vista freezes

    - by Kakurady
    Windows Vista (32-bit) randomly freezes on my computer, usually 15-30 minutes after login, though it can happen just after login. All applications stop responding and the hard drive makes no sound, and after a while the mouse cursor also stops moving. I dual-boot Ubuntu, and that still works fine.

    It started with the computer freezing when loading Team Fortress 2. Alt-Tab and Ctrl-Alt-Del have no effect, and the hard drive makes no sound. I tried to verify the game data using Steam, and that froze the computer too. So I stupidly reinstalled the game. Now the game doesn't freeze when it starts; instead the whole computer randomly freezes.

    This computer is a Dell XPS M1530 with a 320 GB (298 GiB) drive (WDC WD3200BEVT-7) split five ways: one partition each for Windows and Linux, one more for Linux swap space, and another two for the Dell diagnostic program and the factory image and drivers. There was one day when the hard drive made clicking noises all day, stopping only when I rebooted the computer. Since then, the BIOS diagnostics fail the drive (for "self-test log contains previous errors") whenever they are run. (The on-disk diagnostics cannot be run because I overwrote the MBR with GRUB.)

    Naturally, I thought the hard drive could be the problem. CHKDSK found one bad sector, but this seems to have had no effect. System File Checker found two protected files with wrong hashes: one is some kind of IE manifest, and the other is a tcpmon.ini. Neither can be restored, because their backup copies also have wrong hashes. There is nothing about system failures in the event viewer. What should I do next?

  • How to do a Windows 7 Image restore to an external drive?

    - by Vaccano
    I have a system that I have made a Windows 7 image backup of, and I would like to migrate that image to a different hard drive. Is there a way to restore the image to an externally connected hard drive?

    For example, I have three hard drives: the first is in the source machine (the one I want to copy); the second is in the machine that I want to do the work; and the third is not in any machine. The third drive is the target I want to overwrite with the contents of the first.

    I boot up the second machine and connect the third hard drive externally (using some cool cables I have). I then want to use some cool feature of Windows 7 to replace what is on the third hard drive with the Windows 7 image of my first machine (which is on my networked backup server). I need to know what that "cool feature of Windows 7" is, if there is one, and how to use it. Any ideas?

    Note that I very much do not want it to overwrite what is on the second machine's own hard drive.
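
    One hedged avenue: a Windows 7 image backup stores each disk as a .vhd file inside the WindowsImageBackup folder, and diskpart on the second machine can attach that VHD directly; the volume contents can then be copied onto the externally connected third drive. A sketch, with the path as a placeholder (making the result bootable would additionally need bcdboot/bootrec run against the new drive):

        diskpart
        select vdisk file="D:\WindowsImageBackup\SourcePC\Backup 2012-01-01 010101\disk.vhd"
        attach vdisk readonly

    The attached image shows up as an ordinary read-only disk, so files can be copied off with Explorer or robocopy.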

  • Windows 7 - ignore security when reading external drive

    - by w-
    Hi, the system hard drive on an XP computer of mine kind of failed (random corrupt sectors), so I got a new hard drive and am trying to recover the files. The file system is NTFS. The system I'm using for the recovery is Windows 7, and I'm obviously an admin on this box. The last data I'm trying to recover is the stuff in the Documents and Settings folder. I'm using a SATA-to-USB cable thingy, so the old drive just plugs in as an external hard drive.

    The problem: in Windows Explorer, when I try to copy the data, I keep getting prompted with security warnings and error messages. It keeps telling me I have to change the owner permissions of the folder and all its contents. If I tell it to change all the file and folder permissions, it takes a really long time, because it has to recurse through all the folder contents to change them.

    Is there a way for me to ignore the file permissions when doing this? Thanks.
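
    One way to sidestep the ACLs instead of rewriting them is robocopy in backup mode, which copies using the backup privilege administrators hold and ignores file security on the way through. A sketch, with drive letters as placeholders:

        robocopy "E:\Documents and Settings" "C:\Recovered" /E /B /R:1 /W:1

    /B is backup mode, /E copies subdirectories including empty ones, and the low /R and /W values keep it from retrying unreadable files for ages.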

  • Recovering data from an external hard drive

    - by CCallaghan
    I have a WD Elements 2 TB hard drive, formatted NTFS. I accidentally kicked out the USB cable while writing data to the disk, and now I can't access most of the data. Although this was ostensibly my backup drive, there is a great deal of important material on there that was only on there. I realise how idiotic this makes me. (So formatting is not an option.)

    Things I've tried, and information I've gathered:

    - Windows Explorer recognises the drive itself. However, it will not access most directories therein (and sometimes crashes while exploring).
    - I can access all of the directories through the command line, but the dir command often reports that it can't read any files in most of them.
    - The situation was similar when I hooked the drive up to an Ubuntu machine: the file explorer crashed, but I could access directories, though not the files in them, via terminal commands. Several files I tried to copy out either produced a reported I/O error or crashed the command line.
    - The Disk Management utility on Windows reports a healthy disk, formatted as NTFS and not RAW. It also shows the correct amount of space used and the correct capacity (so it seems the files are not deleted).
    - I've tried to run chkdsk, but it hangs on step 2 (checking indexes) at 74%. Step 1 reported no bad sectors.
    - I tried Recuva, but that didn't seem to work (it stalled at 0% for half an hour).

    I should also note that the disk doesn't seem to be spinning smoothly; it seems to keep chopping back, like it's reading the same sector over and over again. I noticed this after I kicked out the cable. Any help would be greatly appreciated.

    Update: it would seem the problem has taken a turn for the worse; the external hard drive now shows up on my computer as a local disk and is not mountable by Linux.
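
    Given the I/O errors and the chopping sound, the standard advice is to stop exercising the drive and image it once with GNU ddrescue from the Ubuntu machine, then point recovery tools at the image instead of the failing disk. A sketch (the device name is an assumption; check dmesg for the real one, and the destination needs to be a healthy disk with enough free space):

        sudo apt-get install gddrescue
        # First pass: grab the easy areas, skipping around bad spots (-n).
        sudo ddrescue -n /dev/sdb rescue.img rescue.map
        # Second pass: retry the difficult areas a few times (-r3).
        sudo ddrescue -r3 /dev/sdb rescue.img rescue.map

    The map file lets the runs resume and stack, so the drive only has to survive each read once.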

  • Master does not appear to be a git repository error

    - by EmmyS
    I've inherited a position, along with instructions for creating a new git repository. Unfortunately I've run into problems, and no one here knows what to do. Hoping someone can help me out. Here are the instructions I was left:

        1. Create a new repository. For these steps you need to be in the gitosis-admin repository; if you don't have it, in a suitable parent folder do:

               git clone [email protected]:gitosis-admin.git

        2. Edit the gitosis.conf file: in the gitosis-admin root, under the [group base-repo] section, add the name of the new repo to the end of the "writable =" line.

        3. Commit the change and push back to gitosis-admin master.

        4. For the next commands, my_new_project represents the name of your project:

               mkdir my_new_project
               cd my_new_project
               git init
               (copy in any files you want to use to start the repo)
               git commit -a -m "Initializing new repository"
               git remote add origin [email protected]:my_new_project.git
               git push master
               git push master:qa

    So I did 1 and 2 with no problem: it created a local folder on my machine called gitosis-admin, and I edited gitosis.conf as indicated. But when I try to do step 3 (which I assume is git push gitosis-admin master), bash tells me:

        fatal: 'master' does not appear to be a git repository
        fatal: The remote end hung up unexpectedly

    What am I doing wrong?
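
    For what it's worth, the error itself comes from git treating the first argument of git push as a remote name: git push master goes looking for a remote called master, finds none, and bails. In a clone, the remote created for you is called origin, so step 3 was presumably meant to be run as (my reading of the error, not the original author's notes):

        cd gitosis-admin
        git commit -am "Add my_new_project to writable list"
        git push origin master

    The bare git push master near the end of the instructions would hit the same wall; after the git remote add origin line it should likewise read git push origin master.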

  • How to set up bindings for a development IIS 7.5 with a lot of sites

    - by Antonio Bakula
    I am a programmer in a small ASP.NET shop with very little experience in server administration, and I have to set up IIS 7.5 to host a lot of sites on a newly installed Windows Server 2008 R2 machine. These sites are test "clones" of the sites on our "real" web server, and they should be accessible only on the local network (domain). Developers add new sites for our new customers, project managers use this server to check progress and test new sites and features, and QA people have to access and test the sites before we copy them to the "real" web server.

    Developers have access to the IIS console; in fact they can RDP to the test server with their developer domain credentials and permissions, and they are local admins on that machine (tester).

    On our previous server I used a different port number for each site. That worked, but I don't like the solution; I would prefer to use subdomains. But here are the problems:

    Manually adding DNS records is not an option, because we don't want developers to have to administer the domain DNS server, and currently that has to be done with domain administrator credentials. Is there some way to add DNS records automatically?

    I tried adding a wildcard DNS record for subdomains of the test server (*.tester), and that seemed to work for a while, but the change caused some bad problems in our domain network, and the admin forbade me to mess with DNS. He said I have to add a DNS record for every subdomain manually, that I cannot use wildcards, and that there is nothing I can do about it, mainly for "political" reasons :( Obviously our admin is pretty uncooperative (outsourced from a different organization), and I can't do anything about that.

    Could I add another DNS server on that machine? What would have to be set up on client machines to "tell" them to use both the domain DNS server and the tester DNS server?

    So please, I need some advice. What should I do? Are different port numbers the only option left? Thanks!
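
    If the admin will allow delegation but not wildcards, the per-site record creation can at least be scripted: dnscmd can add A records to a zone remotely, and the account running it only needs rights on that zone, so a "new site" script could take care of DNS alongside creating the IIS site. A sketch (server, zone, host and IP are placeholders):

        dnscmd dc01.example.local /recordadd example.local newsite.tester A 192.168.1.50

    That yields newsite.tester.example.local without anyone opening the DNS console, and the matching IIS binding can be added by the same script.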

  • Hibernation fails; The system cannot find the file specified

    - by GMMan
    Recently I installed Ubuntu 12.04.1 LTS on my Lenovo Y480. Hibernation was working properly after the Ubuntu install, but then I went through making sure all of the operating systems on the machine worked, including OneKey Recovery (the recovery partition). Note that I installed Windows 7 from scratch with a disk image I downloaded from my university's DreamSpark program; on top of that I had to image the partition with Paragon Backup & Recovery, repartition to convert the Windows partition to extended, install Ubuntu, and then restore the image. During that process I used the Windows disc to edit the BCD so as to reuse the existing entry for the restored partition, and I also used the automated "repair your computer" option.

    On verification, I noticed that the "repair your computer" option had actually written to the wrong BCD (the recovery partition's), so I mounted that partition, restored the original BCD (from a copy I had made earlier), and rebooted. At this point my GRUB broke, and I was able to restore it. At this point hibernation broke.

    I tried powercfg /h off and powercfg /h on, rebooted, and nothing. I also tried increasing the hibernation file size as directed on this post, but it still doesn't work. Executing shutdown /h yields:

        The system cannot find the file specified.(2)

    What file? It seems that mounting the system partition sometimes works, but I don't want to keep it mounted in case it gets written to accidentally. How do I permanently fix this?
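
    Given the partition shuffling and the BCD surgery, one plausible suspect (an educated guess, not a confirmed diagnosis) is the "Resume from Hibernate" entry in the BCD still pointing at the old device, so the loader can't find hiberfil.sys. Worth inspecting from an elevated prompt:

        rem List every BCD entry; under "Resume from Hibernate", check that
        rem the device matches the current Windows partition.
        bcdedit /enum all
        rem If the device is wrong, it can be repointed, e.g.:
        rem   bcdedit /set {guid-of-resume-entry} device partition=C: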

  • Why are all of my ZFS snapshot directories empty?

    - by growse
    I'm running an Oracle Solaris 11 box as a ZFS storage appliance, and I'm taking regular snapshots of the ZFS filesystems via cron. In the past, I knew that if I wanted to grab a particular file from a snapshot, a read-only copy was kept in .zfs/snapshot/{name}/, and I could just navigate there and pull the file out. This is documented on Oracle's website.

    However, I went to do this the other day and noticed that the per-filesystem directories within the snapshot directories are all empty. zfs list -t snapshot correctly shows the list of snapshots that should be present; .zfs/snapshot correctly contains a directory for each snapshot; and in each snapshot there is a directory for each ZFS filesystem. But these directories appear to be empty.

    I just tested a restore by touching a file in a little-used share and rolling back to the latest hourly snapshot, and this appears to have worked fine, so the rollback functionality is there. Did Oracle change how snapshots are done? Or is something seriously wrong here?
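
    Two things worth checking before assuming breakage. First, the snapdir property controls only visibility of .zfs, but confirming it is cheap. Second, and more likely the answer: a snapshot covers only its own dataset, so inside a parent filesystem's snapshot the child filesystems appear as empty mountpoint stubs; each child's files live under that child's own .zfs/snapshot, not the parent's. A quick check (dataset and snapshot names are placeholders):

        # Visibility of the .zfs directory for this filesystem:
        zfs get snapdir tank/share
        zfs set snapdir=visible tank/share

        # Look in the child dataset's own snapshot directory:
        ls /tank/share/.zfs/snapshot/hourly-2013-03-01/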

  • Transfer of ownership of Windows 7

    - by ziggy
    I am thinking of purchasing a copy of Windows 7 via either eBay or Gumtree, but I am unsure how the product key works. A close friend of mine is warning me against buying it from eBay; he suggests that once the key has been used, the operating system registers itself on Microsoft's servers using the serial number of the motherboard of the system where it was installed, meaning that once installed on one machine it can't be installed on another. I am struggling to believe that an operating system can only ever be installed on one machine. Can someone please explain exactly how this works?

    I can see a lot of used copies being sold on eBay. I used the 'Ask a question' option, and the majority of users say I should be able to use it. If someone buys Windows 7 from a shop, installs it on his PC, but then decides he wants to sell it, can he not sell it? Will the person buying it not be able to use it? Does the person selling it have to somehow unregister it first? What do I need to look out for if buying it from eBay? Thanks.

  • DVD/CD burning .zip: is it more reliable, faster, longer lasting to burn a zip of files rather than the files as a folder?

    - by Rob
    Is it more reliable, faster, and longer-lasting to burn to CD/DVD one zip (or a few large zips) of files, rather than the files as a folder? I'm just thinking that thousands of small files might not be recorded as efficiently as one or a few large zips.

    Also, even after the burning program verifies the disc, I use Beyond Compare to compare the files with those on the disc. It always binary-compares them as identical, but I hear the drive stuttering, presumably as the head is shifted slightly each time to seek the next file, which leads me to think it's best to make one or more zips and copy those locally to compare. Or is it that burning individual files to the disc makes them less readable, which causes the head to stutter?

    There aren't any actual problems; my disc burns are reliable. I'm thinking more of efficiency and longevity. The discs burn and verify fast enough on my 18x DVD burner. I'm using ImgBurn mostly, and have used Nero in the past. I burn whole discs, closed and finalised. I'm not sure which write mode, but I would think Disc At Once from a temporary cached image made by the burning program would be the most reliable.

  • Getting the EFS Private Key out of system image

    - by thaimin
    I recently had to re-install Windows 7, and I lost my exported private key for EFS. However, I have the entirety of my old user directory, and my thinking is that the key must be in there somewhere. The only question is how to get it out.

    I did find the public keys in AppData\Roaming\Microsoft\SystemCertificates\My\Certificates. If I import them using certmgr.msc, the certificate information says I do have the private key, but if I try to export them it says I do not have the private key, and decryption of files doesn't work. There is also a Keys folder at AppData\Roaming\Microsoft\SystemCertificates\My\Keys; after importing the certificates I copied those files over into my new installation, but that had no effect.

    I am starting to believe the keys are in AppData\Roaming\Microsoft\Protect\S-1-5-21-...\ or AppData\Roaming\Microsoft\Crypto\RSA\S-1-5-21-...\, but I am unsure how to use the files in those folders. Also, since my SID has changed, will I be able to use them? The other parts of the account have remained the same (name and password). I also have complete access to the old user registry hive and most of the old system files (including the old system registry hives).

    I keep seeing references to a "Key Recovery Agent", but have found nothing about using one, just that it can be used. Thanks!
