Search Results

Search found 9410 results on 377 pages for 'special folders'.

Page 285/377 | < Previous Page | 281 282 283 284 285 286 287 288 289 290 291 292  | Next Page >

  • ADSIEdit Cleanup After Exchange 2003 Crash During Transition To Exchange 2010

    - by ThaKidd
    Hello all. I would value some input from a few experts. I have almost completed the transition from Exchange 2003 Standard to Exchange 2010 Standard. Everything went smoothly until I tried to uninstall Exchange 2003; at that point the server bit the dust and died completely. I now have NO access to the old Exchange System Management MMC as I am running Windows Server 2008 R2 and Windows 7 only, so I can only fix this with ADSI Edit, the Exchange Management Shell, and the Exchange Management Console. I have used the 2010 shell to move/remove/verify that all mailboxes, public folders and the OAB are hosted on Exchange 2010, and I also verified that the routing group connector has been deleted. The only two things that were not done were to remove the Recipient Update Service and to actually perform the removal of the 2003 software. I have spent a lot of time going through ADSI Edit and have located the old Administrative Group and the Exchange 2003 server listed under it. I also located the Recipient Update Service, which includes two entries: Enterprise and my domain name. I have read that it is unwise to remove the old administrative group, so I won't bother messing with that. I am repeatedly getting warnings in the Application Log from MSExchangeTransport: EventID 5006 (Cannot find route to Mailbox Server OLDSERVER) and EventID 5020 (The topology doesn't contain a route to Exchange 2000 Server or Exchange Server 2003). So my questions are: (1) To clean out AD of the old Exchange 2003 info, can I safely delete the server name folder (Configuration - Services - Microsoft Exchange - ExchOrg - Administrative Groups - First Administrative Group - Servers - Old Server) and also delete the Recipient Update Service (Enterprise) and Recipient Update Service (DOMAIN) containers? (2) Are there any additional items I need to address to ensure the AD is clean? Thanks in advance for your help!

    Read the article

  • Recovery of a Windows DFS partition with shadow-copy versioned files when overwritten with an older modified file

    - by patjbs
    I've noticed the following "bug" on a DFS volume with shadow copies. Pretend you have the following folders/files under shadow copy versioning, going back two weeks:

        MyDirectory
        + MyFile - Modified Date 8/1/2009

    The current date: 8/30/2009. You have another version of MyFile stored elsewhere, with a modified date of 7/1/2009. Copy your other version of MyFile into MyDirectory, overwriting the newest version. I expected that you could roll back to the version that was there when the last snapshot was taken, say on the prior day, and recover your 8/1 version. Not the case. Now, when you go to look at previous versions for the past two weeks, the versioning of that file is entirely lost, and you're stuck with your older 7/1 version. Suckage. Questions: (1) Is this intentional, and if so, what's the rationale? I assume that DFS picks up on the versioning based on the current file, and that's what's wiping out prior versions, but it seems like a fairly stupid/naive way of handling versioning to me. (2) Is there a way to backtrack out of this, without resorting to restoration from other backup media? Thanks!

    Read the article

  • Enabling syntax highlighting for LESS in Programmer's Notepad?

    - by Cody Gray
    When I don't feel like firing up the Visual Studio behemoth, or when I don't have it installed, I always turn to Programmer's Notepad. It's an amazingly light and fast little text editor, with the special advantage that it is completely platform-native and conforms to standard UI conventions. Therefore, please do not suggest that I consider using other text editors. I've already considered and rejected them because they do not use native UI controls. I like Programmer's Notepad, thank you very much. Unfortunately, I've recently begun to learn, use, and love LESS for all of my CSS coding needs, and it appears that Programmer's Notepad is not bundled with a syntax highlighting scheme for LESS. Does anyone know if there is—by chance and good fortune—one already available somewhere on the web that some kind soul has tediously prepared? If not, how can I go about writing one of my own? Is there a way to build on the existing CSS scheme? It's also possible that any code coloring scheme designed for Scintilla-based editors will work, as Programmer's Notepad is based on the Scintilla control. If you know of a LESS highlighting scheme for Scintilla-based editors, and how to use that with Programmer's Notepad, please suggest that as well.

    Read the article

  • Unconvert Text File from Binary Format

    - by Hammer Bro.
    I've got a rather large CSV file (~700MB) which I know to consist of lines of 27-character alphanumeric hashes; no commas or anything fancy. Somehow, during its migration from Windows to Linux (via WinSCP and then a few regular SCPs), it has been converted into some kind of binary format I am unfamiliar with. If I open the file in vi, everything appears fine and it says [converted] at the bottom, although I know it's not a line-endings issue (and dos2unix doesn't help). If I 'head' the file, it looks proper except for a "ÿþ" at the beginning of the first line. If I open the file in nano, however, I see the "ÿþ" at the start and then "^@" before every character (even newlines and EOF). If I try to re-save or copy the file (say via: head file.csv > short.txt), this special encoding is preserved. I copied the first ten lines out of vi (which displays it properly) into my Windows clipboard via my SSH client, then pasted it into a new text file, test.txt. This file is visually identical when opened in vi (and similar through 'head', minus the "ÿþ"), although it's roughly half the filesize. Additionally:

        file test.txt
        test.txt: ASCII text
        file short.txt
        short.txt:

    I have no idea what format this once-text file got converted to (it's notoriously hard to search the internet for symbols), but surely there must be some way to convert it back. Any ideas?
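    The "ÿþ" prefix and the "^@" before every character look like a UTF-16 little-endian byte-order mark plus null high bytes, so a plain iconv conversion is the usual first thing to try (a sketch, not a confirmed diagnosis; the output filename is just an example):

        iconv -f UTF-16 -t UTF-8 file.csv > file.utf8.csv
        file file.utf8.csv    # should now report plain ASCII/UTF-8 text

    If the content really is pure ASCII hashes, the converted file should come out at roughly half the size, matching the test.txt observation above.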

    Read the article

  • Outgrew MongoDB … now what?

    - by samsmith
    We dump debug and transaction logs into MongoDB. We really like MongoDB because:

    - blazing insert performance
    - document oriented
    - ability to let the engine drop inserts when needed for performance

    But there is this big problem with MongoDB: the index must fit in physical RAM. In practice, this limits us to 80-150 GB of raw data (we currently run on a system with 16 GB RAM). So, for us to hold 500 GB or a TB of data, we would need 50 GB or 80 GB of RAM. Yes, I know this is possible. We can add servers and use Mongo sharding. We can buy a special server box that can take 100 or 200 GB of RAM, but this is the tail wagging the dog! We could spend beaucoup $$$ on hardware to run FOSS, when SQL Server Express can handle WAY more data on WAY less hardware than Mongo (SQL Server does not meet our architectural desires, or we would use it!). We are not going to spend huge $ on hardware here, because it is necessary only because of the Mongo architecture, not because of the inherent processing/storage needs. (And sharding? Please! Cost aside, who needs the ongoing complexity of three, five, or more servers to manage a relatively small load?) Bottom line: MongoDB is FOSS, but we gotta spend $$$$$$$ on hardware to run it? We would rather buy commercial SW! I am sure we are not the first to hit this issue, so we ask the community: where do we go next? (We already run Mongo v2.) Thanks!!

    Read the article

  • IMAP proxy as a POP3 hub?

    - by mailman stan
    Simple scenario, complicated technology: One family receiving mail from five email addresses via POP3 into one Outlook inbox on a single PC. Now we'd like to be able to replicate that single inbox across multiple devices (eg. desktop PC, laptop, netbook, smartphone). If we continue using POP3 as the mail transfer protocol, messages will be downloaded to one device and will not be visible to the others; replies will likewise be isolated on the sending machine. If we switch to IMAP, I understand that we can have multiple devices maintaining a shared view of an inbox hosted at the server end, but what about multiple accounts? I tried changing the account configuration in Outlook to fetch from the mail providers' IMAP service instead of POP3, which does give a shared view across multiple devices but also causes Outlook to create a separate inbox and PST for each account. This is awkward because it means there are five separate folders that need to be checked, and Outlook tools like search filters and rules don't seem to work across accounts. To get what I want (five accounts delivered into one shared mailbox) it seems that I would need some sort of intervening server that collects mail (using POP3) from all our accounts into a single inbox while preserving the original destination addresses, and then serves it up to all our devices using IMAP. Is this workable? Is it a good approach? Is there an easier way?
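    For what it's worth, one common way to build exactly this kind of hub is fetchmail pulling the POP3 accounts into a single local mailbox that an IMAP server (Dovecot, for instance) then serves to every device. A sketch of a ~/.fetchmailrc for two of the five accounts (hostnames, usernames, passwords and the local "familybox" account are all placeholders):

        set daemon 300
        poll pop.provider1.example protocol pop3
            user "family1" there with password "secret1" is "familybox" here ssl
        poll pop.provider2.example protocol pop3
            user "family2" there with password "secret2" is "familybox" here ssl
        # ...the remaining three accounts follow the same pattern

    Fetchmail hands each message to local delivery with its original headers intact, so it should still be possible to tell which address a message was sent to.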

    Read the article

  • DriveImage XML returning Windows Volume Shadow Service (VSS) error

    - by Ssvarc
    I'm trying to image a SATA laptop hard drive, attached to my computer via a USB adapter, using DriveImage XML. I'm running Windows 7 Ultimate 64-bit. DriveImage XML is returning:

        Could not initialize Windows Volume Shadow Service (VSS).
        ERROR: C:\Program Files (x86)\Runtime Software\DriveImage XML\vss64.exe failed to start.
        ERROR: TIMEOUT
        Make sure VSSVC.EXE is running in your task manager. Click Help for more information.

    VSSVC.EXE is running in Task Manager, as is vss64.exe. Looking at the FAQ on the Runtime web page turned this up:

        Please verify in Settings - Control Panel - Administrative Tools - Services that the following services are enabled: MS Software Shadow Copy Provider, Volume Shadow Copy. Also make sure you are able to stop and start these services. Possible reasons for VSS failures: For VSS to work, at least one volume in your computer must be NTFS. If you use only FAT drives, VSS will not function. The required NTFS volume does not need to be identical with the volume you want to image. You should make sure that VSSVC.EXE is running in your task manager. If the problems persist, registering "oleaut.dll" and "oleaut32.dll" using "regsvr32" might help.

    Both of those services are running and can be started and stopped without issue. Using regsvr32 to register "oleaut32.dll" returns successful, but "oleaut.dll" returns:

        The module "oleaut.dll" failed to load. Make sure the binary is stored at the specified path or debug it to check for problems with the binary or dependent .DLL files. The specified module could not be found.

    Some other information that might be relevant: browsing to the drive is successful, but accessing certain folders returns an "access" error, after which Windows runs a permissions adder that adds the current user profile to the NTFS permissions. Could this be the cause of the issue? DriveImage XML is running as Administrator. Thoughts?
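    One low-risk sanity check, from an elevated Command Prompt, is to ask the VSS stack to describe itself and then bounce the service (stock Windows commands; "VSS" is the service name of Volume Shadow Copy):

        vssadmin list providers
        vssadmin list writers
        net stop VSS
        net start VSS

    If a writer or provider shows a failed state there, that points at VSS itself rather than at DriveImage XML.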

    Read the article

  • Outlook errors and blank address book?

    - by Chasester
    I have a user with a fresh install of Windows 7 64-bit. Office 2010 was installed as well. Everything worked great until we installed Adobe Pro 9 and WordPerfect 12 (for Outlook we are not using Exchange). Then Outlook popped up an error saying it could not start. I modified it to run in compatibility mode and Outlook started; I then took compatibility mode off and Outlook continues to start without difficulty. However, her address book went blank (the contacts are all still there). In the properties of the Contacts folder the option to include this folder in the address book is checked (and grayed out), and the Outlook Address Book is listed when I look at the account properties. I tried creating a new profile, to no avail. I tried creating a new profile and a new PST file - to no avail. I tried uninstalling Office, removing the folders inside Roaming and Local, and reinstalling Office; I got the same "could not start Outlook" error, did the compatibility-mode bit and got it to start, but the address book continues to be empty. Has anyone run across this before? I'm thinking there must be some other preferences-type folder besides those in Roaming and Local, since her signature remained when I reinstalled Office.

    Read the article

  • ESXi 4.0 - cannot copy files

    - by Peter
    I am unable to copy files or make directories on my installation of VMware ESXi 4.0. I have done so in the past (copied an ISO onto a datastore), but something has changed and I have no idea what. I cannot copy using the datastore browser (I get a dialog saying "Expected a PUT_FILE_DONE message. Got SESSION_COMPLETE"). I cannot create a directory through the datastore browser (I get a dialog saying "Cannot complete file creation operation"). When I SSH to the ESXi server I cannot create files or folders under /vmfs/volumes, but I can manipulate files elsewhere (including /vmfs). Here are the permissions for the directories (I am logged in as root):

        ~ # ls -lh /vmfs/volumes/
        drwxr-xr-t    1 root     root        1.2k Sep  3 12:19 4a76f260-36b7eb85-c3b3-0024e8314929
        drwxr-xr-x    1 root     root           8 Jan  1  1970 4a76f261-d6190a9e-3b89-0024e8314929
        drwxr-xr-t    1 root     root        1.4k Sep 22 10:38 4a76f262-4ac21f0a-6bc1-0024e8314929
        l---------    0 root     root        1.9k Jan  1  1970 Hypervisor1 -> c42ce27f-eb8d7f70-7f70-0e7a85e8edc4
        l---------    0 root     root        1.9k Jan  1  1970 Hypervisor2 -> bbf1477b-4aec1d8c-caa5-5e8720bebd85
        l---------    0 root     root        1.9k Jan  1  1970 Hypervisor3 -> efd8efe3-03bc1cbf-15e0-080efd9e7379
        drwxr-xr-x    1 root     root           8 Jan  1  1970 bbf1477b-4aec1d8c-caa5-5e8720bebd85
        drwxr-xr-x    1 root     root           8 Jan  1  1970 c42ce27f-eb8d7f70-7f70-0e7a85e8edc4
        l---------    0 root     root        1.9k Jan  1  1970 datastore1 -> 4a76f260-36b7eb85-c3b3-0024e8314929
        l---------    0 root     root        1.9k Jan  1  1970 datastore2 -> 4a76f262-4ac21f0a-6bc1-0024e8314929
        drwxr-xr-x    1 root     root           8 Jan  1  1970 efd8efe3-03bc1cbf-15e0-080efd9e7379
        ~ # touch /vmfs/foo.txt
        ~ # touch /vmfs/volumes/foo.txt
        touch: /vmfs/volumes/foo.txt: Operation not permitted

    I've googled and found nothing helpful. Does anyone out there have an idea as to what is going on? Thanks in advance. Pete.

    Read the article

  • File recovery from Mac results in random files and extensions – how do I get my data back?

    - by Robsta
    This Mac hard drive was dying. Someone I knew did a file recovery and got as many files as he could. The program (not sure how it was done, or what program it was) dished out a bunch of files with names such as DIR56.TOC, DIR55.CUR, DIR54.GPZ, DIR53.GZI, and so forth, all the way down to DIR0.LZH. Some of the file extensions I do understand (like .JPEG or .MOV), but most of them are ones I've never heard of. I've googled some of them, like .TOC, which stands for "table of contents", but I don't understand how to transfer that data back to the Mac. Currently, the files are on a Windows machine. They are being transferred onto an external hard drive that the Mac can read, and it can see all the files. However, the few that I tested to see if the Mac recognizes them (like .TOC and .CUR) cannot be opened. Anyone have any idea as to what I should do? There are some important assignments on there I need to get. EDIT: Data transfer was most likely done by Easy Recover 6 professional (95% sure, no guarantee).

    Read the article

  • USB Hardware vs. Software Write Lock

    - by TreyK
    I'm in the market for a USB flash drive, and remember this cool feature a tiny 32MB flash drive of mine had: a write lock switch. This seemed like it would be an amazing feature to have as a shield against any nastiness happening to the drive on an unfamiliar computer. However, very few drives on the market offer this feature. Instead, it seems that forms of software protection are the more prominent method. This software protection causes me a bit of uneasiness, as it seems like this software wouldn't be nearly as bulletproof as a physical switch. Also, levels of protection seem to vary from product to product. Being able to protect certain folders from reading and/or writing would be nice, but is the security trade-off worth it? Just how effective can this software protection be? Wouldn't a simple format be able to clean any drive with software protection? My drive must also be compatible with Windows XP, Vista, and 7, as well as Linux and Mac. What would be the best way forward for getting a well-sized (~8GB) flash drive with a strong write protection implementation, for little or no more than a regular drive? Thanks.

    Read the article

  • Unreadable corrupted NTFS partition - lost clusters reported

    - by Eduardo Martinez
    Partition Magic is reporting multiple 'bad file record signature' and 'lost clusters' errors on my 250 GB Samsung SATA disk (connected via USB to an XP SP3 machine). Unfortunately PM is unable to fix them. PM shows the drive as NTFS, detects the used space correctly, and also sees the drive name. But the PM browser (right-click on the partition, Browse...) won't show anything, as if the disk were empty. Windows Explorer is not even picking up the drive name and reports 'the file or directory is corrupted and unreadable'. The PTDD Partition Table Doctor demo tells me the boot sector is fine, and I can see all the disk content in its browser - but crucially I cannot copy that content over to a new disk (the PTDD browser is pretty arid, to say the least). Also tried:

    - photorec 6.11.3 - it actually started to extract files but wouldn't keep file names or any folder structure (maybe I missed something in the configuration options)
    - Find and Mount - the intellectual scan went well and the only partition on the disk was detected, but when I tried to mount it as p: I got this error in Windows Explorer: 'p:\ is not accessible. The media is write protected'. Find and Mount also lets you create an image from the partition, but I don't have a disk big enough at hand. Does anyone know if this will keep the extracted files/folder structure intact?

    I'm starting to think the disk is pretty screwed and my chances to recover this data are slim. Please someone enlighten me with that marvellous piece of software I am missing :-) Thanks in advance

    Read the article

  • Subversion and QuickBooks Files

    - by Jorge Fernandez
    I currently have a large problem on one of the file servers I manage for an accounting firm. QuickBooks has a tendency to create multiple files of the same thing over and over to prevent data loss. This is a good thing when you handle just a few files, but at an accounting firm it becomes a problem. Some of the older clients have 5-10 files in their respective folders, each with a different cut-off date, and because of user error some of these files aren't labeled properly with their correct cut-off dates. This is where Subversion came to mind: using the revision system would allow one file to be the master and carry all of its revisions. Has anyone ever tried this with QuickBooks files? I've only used SVN with application code, where each file is much smaller. How does SVN stand up with larger files like 10-25 MB? I'm not exactly sure how SVN handles revisions - does it keep a duplicate of the file and so duplicate the disk space needed?

    Read the article

  • Perl TDS character sets

    - by skiphoppy
    I'm using the FreeTDS driver with DBD::Sybase, connecting to an MS SQL Server. When I query certain values of certain records, I get this error:

        DBD::Sybase::st fetchrow_arrayref failed: OpenClient message: LAYER = (0) ORIGIN = (0) SEVERITY = (9) NUMBER = (99)
        Server , database
        Message String: WARNING! Some character(s) could not be converted into client's character set. Unconverted bytes were changed to question marks ('?').

    This seems to happen for records that contain special Windows character-set characters, such as curly quotes, copied and pasted from people's Outlook and Word messages. Unfortunately, I do not have any control of this database; sanitizing the input on the way in is obviously the way to go, but is not available to me. What FreeTDS settings do I need to change to be able to successfully query these records? Additional information: the query works fine from tsql. I only get this error through Perl's DBD::Sybase interface. (Should I test through something else? I don't have the expertise yet to install PHP or Python. I've got jTDS and can use it, but I think that's a completely different implementation, not an interface to FreeTDS.) Adding client charset = UTF-8 to my freetds.conf file results in "Out of memory!" printed to STDERR.
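    For reference, the combination that is often suggested for Unicode data is raising the TDS protocol version alongside the client charset; a hedged freetds.conf sketch (the server entry name and host are placeholders, and the right "tds version" depends on the SQL Server release):

        [mssql]
            host = sqlserver.example.com
            port = 1433
            tds version = 8.0
            client charset = UTF-8

    The thinking is that client-side charset conversion needs a newer TDS protocol; if FreeTDS is still falling back to the old 4.2 default, odd failures like the "Out of memory!" message are plausible, though that is only a guess from the symptoms above.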

    Read the article

  • mdadm superblock hiding/shadowing partition

    - by Kjell Andreassen
    Short version: Is it safe to do mdadm --zero-superblock /dev/sdd on a disk with a partition (/dev/sdd1), filesystem and data? Will the partition be mountable and the data still there?

    Longer version: I used to have a raid6 array but decided to dismantle it. The disks from the array are now used as non-raid disks. The superblocks were cleared:

        sudo mdadm --zero-superblock /dev/sdd

    The disks were repartitioned with fdisk and filesystems created with mkfs.ext4. All disks were mounted and everything worked fine. Today, a couple of weeks later, one of the disks is failing to be recognized when trying to mount it, or rather the single partition on it:

        sudo mount /dev/sdd1 /mnt/tmp
        mount: special device /dev/sdd1 does not exist

    fdisk claims there to be a partition on it:

        sudo fdisk -l /dev/sdd

        Disk /dev/sdd: 2000.4 GB, 2000398934016 bytes
        255 heads, 63 sectors/track, 243201 cylinders
        Units = cylinders of 16065 * 512 = 8225280 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0xb06f6341

           Device Boot      Start         End      Blocks   Id  System
        /dev/sdd1               1      243201  1953512001   83  Linux

    Of course mount is right, the device /dev/sdd1 is not there; I'm guessing udev did not create it because of the mdadm data still on the disk:

        sudo mdadm --examine /dev/sdd
        /dev/sdd:
                  Magic : a92b4efc
                Version : 1.2
            Feature Map : 0x0
             Array UUID : b164e513:c0584be1:3cc53326:48691084
                   Name : pringle:0  (local to host pringle)
          Creation Time : Sat Jun 16 21:37:14 2012
             Raid Level : raid6
           Raid Devices : 6

         Avail Dev Size : 3907027120 (1863.02 GiB 2000.40 GB)
             Array Size : 15628107776 (7452.06 GiB 8001.59 GB)
          Used Dev Size : 3907026944 (1863.02 GiB 2000.40 GB)
            Data Offset : 2048 sectors
           Super Offset : 8 sectors
                  State : clean
            Device UUID : 3ccaeb5b:843531e4:87bf1224:382c16e2

            Update Time : Sun Aug 12 22:20:39 2012
               Checksum : 4c329db0 - correct
                 Events : 1238535

                 Layout : left-symmetric
             Chunk Size : 512K

           Device Role : Active device 3
           Array State : AA.AAA ('A' == active, '.' == missing)

    My mdadm --zero-superblock apparently didn't work. Can I safely try it again without losing data? If not, are there any suggestions on what to do? Not starting mdadm at all on boot might be a (somewhat unsatisfactory) solution.
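    For what it's worth, a cautious sequence to try (a sketch only, on the assumption that the version 1.2 superblock sits in the unpartitioned gap at the start of the disk, before /dev/sdd1's data; have a backup or at least accept the risk first):

        sudo mdadm --examine /dev/sdd           # confirm the stale metadata is on the whole-disk device
        sudo mdadm --zero-superblock /dev/sdd   # clear it again
        sudo partprobe /dev/sdd                 # ask the kernel to re-read the partition table
        sudo fsck.ext4 -n /dev/sdd1             # read-only check before mounting
        sudo mount /dev/sdd1 /mnt/tmp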

    Read the article

  • Catastrophic Failure opening ODBC via Citrix

    - by Joshdan
    We recently had our Citrix server crash unexpectedly. When it came back up, there was a new issue: every ODBC connection fails with "Catastrophic Failure" (0x8000FFFF). The issue is limited to Citrix/ICA connections; logging in as the same user via RDP works as usual. The following code is my minimal test case (for wscript):

        ''// test_odbc.vbs
        strConn = "Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=c:\files\;"
        Set rs = CreateObject("ADODB.recordset")
        strSQL = "SELECT * FROM myFile.csv"
        wscript.echo "Press OK to Test"
        ''// This line breaks over Citrix, but not over Terminal Services
        ''// ----------------------
        rs.open strSQL, strConn, 3,3
        ''// ----------------------
        wscript.echo rs("a")

    Any insight would be greatly appreciated. Windows Server 2003 SP1, Citrix MetaFrame Presentation Server 4.0. Clients include at least versions 10.2-11 running on 2000-Vista, OS X. The ODBC error happens whether a DSN is used or not, on at least Access, MS-SQL, and CSV. Connections both through the SSL Gateway and directly. There have been a few users actually able to log in without trouble, but I can't pin down anything special about them.

    Read the article

  • What's up with stat on MacOSX/Darwin? Or filesystems without names...

    - by Charles Stewart
    In response to a question I asked on SO, "Give the mount point of a path", one respondent suggested using stat to get the device name associated with the volume of a given path. This works nicely on Linux, but gives crazy results on Mac OS X 10.4. For my system, df and mount give:

        cas cas$ df
        Filesystem                512-blocks     Used     Avail Capacity  Mounted on
        /dev/disk0s3                58342896 49924456   7906440    86%    /
        devfs                            194      194         0   100%    /dev
        fdesc                              2        2         0   100%    /dev
        <volfs>                         1024     1024         0   100%    /.vol
        automount -nsl [166]               0        0         0   100%    /Network
        automount -fstab [170]             0        0         0   100%    /automount/Servers
        automount -static [170]            0        0         0   100%    /automount/static
        /dev/disk2s1               163577856 23225520 140352336    14%    /Volumes/Snapshot
        /dev/disk2s2               409404102  5745938 383187960     1%    /Volumes/Sparse

        cas cas$ mount
        /dev/disk0s3 on / (local, journaled)
        devfs on /dev (local)
        fdesc on /dev (union)
        <volfs> on /.vol
        automount -nsl [166] on /Network (automounted)
        automount -fstab [170] on /automount/Servers (automounted)
        automount -static [170] on /automount/static (automounted)
        /dev/disk2s1 on /Volumes/Snapshot (local, nodev, nosuid, journaled)
        /dev/disk2s2 on /Volumes/Sparse (asynchronous, local, nodev, nosuid)

    Trying to get the devices from the mount points, though:

        cas cas$ df | grep -e/ | awk '{print $NF}' | while read line; do echo $line $(stat -f"%Sdr" $line); done
        / disk0s3r
        /dev ???r
        /dev ???r
        /.vol ???r
        /Network ???r
        /automount/Servers ???r
        /automount/static ???r
        /Volumes/Snapshot disk2s1r
        /Volumes/Sparse disk2s2r

    Here, I'm feeding each of the mount points scraped from df to stat, outputting the results of the "%Sdr" format string, which is supposed to be the device name. Cf. the stat(1) man page:

        The special output specifier S may be used to indicate that the output, if applicable, should be in string format. May be used in combination with:
        ...
        dr      Display actual device name.

    What's going on? Is it a bug in stat, or some Darwin VFS weirdness?

    Postscript: Per Andrew McGregor, try passing "%Sd" to stat for more weirdness. It lists some apparently arbitrary subset of files from CWD...
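    As a workaround while stat misbehaves, df itself can resolve the device backing a path (a small sketch; the path is just an example, and pseudo-filesystems will print their filesystem name rather than a /dev node):

        path="/Volumes/Snapshot"
        df -P "$path" | awk 'NR==2 {print $1}'    # prints /dev/disk2s1 on the system above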

    Read the article

  • TicTac Photo and Windows 7

    - by Ben
    Hello, my wife has been creating a TicTac Photo album. I had to upgrade to Windows 7 as I'd had enough of Vista, so I backed up the TicTac Photo file and the photos to an external hard disk and performed a fresh install of Win7. Now here is the problem: TicTac Photo says it can't find the photos in the album. The locations were as follows:

        Vista: C:\Users\Kelly\Pictures
        Win 7: C:\Users\Kelly\My Pictures

    When I try to create a Pictures folder under Kelly it pops up a message about merging the two folders and simply moves the pictures to the My Pictures folder. Does anyone know a way to make a folder called Pictures so I can eliminate the file path problem and then try again with TicTac Photo support to get them to fix my file? My wife is going to kill me, as it's our wedding album and she has spent upwards of 30 hrs designing it, and me upgrading to Win 7 means it's all my fault. She does not understand file paths etc. I'm going to try and open the album file in a text editor and see if I can see anything, but thought I would ask here as well. Any help appreciated.

    Read the article

  • Cannot open files in Visual Studio but can in Delphi and Notepad

    - by Andrew J. Brehm
    About an hour ago Visual Studio 2008 decided that it cannot find files any more. This is on 64-bit Windows Vista. When I right-click on a text file (source code or otherwise) and select "Open with" and "Visual Studio 2008", I get the following error (example):

        Windows cannot find 'C:\Users\ajbrehm\Documents\Visual Studio 2008\Projects\Hello Prism\Hello Prism\Main.pas'. Make sure you typed the name correctly, and then try again.

    When I right-click the same file and select "Open with" and "Delphi 2010" or "Notepad" (the two other options available for text files on my system), the file opens correctly. Oddly enough, when the file is part of a Visual Studio project and I open the project itself with Visual Studio (this works), I can open the file from within Visual Studio. Any ideas what might be going on? This started about an hour after I made a complete backup of my Vista VM and after I installed IIS 7, SQL Express, and SourceGear Vault. The first files I noticed couldn't be opened in Visual Studio any more were Pascal source files in folders checked out from Vault. Vault also seems to be unable to see one of the source files and claims it doesn't exist. I found out about Visual Studio not opening ANY files any more when I tried to recreate the file Vault refused to see.

    Update: I just checked. Another user, "administrator", can still open text files with Visual Studio 2008. Both users have administrator rights.

    Update: I just restored the hours-old backup. Same problem. Apparently whatever triggered this happened before the install of IIS 7 and SQL Express. Never noticed it before.

    Read the article

  • Explorer and open file dialog not responding (Vista)

    - by rohancragg
    Any Explorer window opened for the first time on my machine displays the folders tree and the folder path in the address bar immediately, but the file/folder list pane is blank and the window shows 'Not Responding' in the title bar; this hangs for up to a minute or more. Any file dialog likewise displays 'Not Responding' in the title bar, and the file list is eventually displayed after a few seconds or more.

    Steps to repro:
    1. Close all open instances of Explorer.
    2. Windows Key | Run | [enter a folder path such as 'c:\temp'], or within any app use a file open/save dialog.

    Once there is at least one open instance of Explorer the performance is still fairly poor, but not nearly so bad, and file lists are displayed in a timely fashion.

    What I've tried:
    - Cleaned up the registry with CCleaner, and uninstalled all other unused software
    - Checked that nothing unwanted is running at startup with Autoruns
    - Removed any ISO burner/recorder/mount software

    Still to try:
    - Get the latest version of everything - especially stuff with shell-extension behaviour such as TortoiseSVN

    Anyone have any other suggestions? Thanks a lot.

    Update: I'm wondering if this is related; I'll try the hotfix when I get home and report back: KB972685 - FIX: Explorer.exe hangs when using a shell extension written using MFC

    Update 2: Before I got a chance to try the hotfix, it seems one of the above actions fixed this for me; either the removal of IsoRecorder or TortoiseHg (which I was no longer using anyway).

    Update 3: A similar issue with Explorer.exe has come back since installing TortoiseHg 1.01 :-(

    Read the article

  • Broke NetBeans file associations in Windows XP - how do I get them back?

    - by Serhiy
    I broke my file associations in XP... Does anyone have any clue how to fix them? When I right-click and select Open With..., the application I want to use to open the file (NetBeans) is not on the list, and when I browse for it, it won't let me select it (well, it does, but then it won't add it to the list). The way I broke it was by installing NetBeans 6.7 and then uninstalling 6.5; since then my file associations have all been broken. I even tried uninstalling NetBeans and reinstalling it again... no luck. I even went as far as adding my own action called "OpenIt" to the file types I wanted, and that works, but only if the files/folders that contain them don't have any spaces in their paths; otherwise NetBeans throws a ".....does not exist, or is not a plain file" error. Thus nothing off the desktop can be opened... Does anyone know how I can fix this problem? Thanks.

    Read the article

  • How do I Set Up Multiple Sites in HostGator Shared Hosting?

    - by cillosis
    I recently decided to consolidate all of my random projects into a single hosting account, as it was starting to get very expensive to run each on an individual hosting plan. I purchased the HostGator Baby plan, which allows hosting of multiple domains. You have to set it up with a root domain name, which is fine (I used my portfolio domain name). As far as file structure, I wanted a folder for each site in /public_html, so the structure looks like this:

        public_html/
            myportfolio.com/
                ... my files ...
            anothersite.com/
                ... my files ...
            thirdsite.com/
                ... my files ...

    I set up add-on domains and pointed them to their respective folders, which works fine. My problem is that the root domain, e.g. myportfolio.com, expects its files to be at the root of /public_html rather than within the folder I created. I set up a redirect to point requests for myportfolio.com to myportfolio.com/myportfolio.com/, which works initially, except (at least in my WordPress installation) it still references its root folder as public_html. TL;DR: What is the best way to go about setting up multiple-site hosting in a shared hosting environment (i.e. I can't set up vhosts)? Does anybody know of any tutorials or videos that walk through this more clearly? Thanks.
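    One common shared-hosting workaround is an .htaccess rewrite in public_html that maps the primary domain onto its own folder; a sketch using the folder names above (WordPress's siteurl/home settings would still need to point at the bare domain):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?myportfolio\.com$
        RewriteCond %{REQUEST_URI} !^/myportfolio\.com/
        RewriteRule ^(.*)$ /myportfolio.com/$1 [L]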

    Read the article

  • Online Storage and security concerns

    - by Megge
    I plan to set up a small file server. I already own a small server at HostEurope (VirtualServer L, 250 GB space), but they don't offer enough space (there is the HostEurope Cloud, but paying for bandwidth isn't an option here; video streaming should be possible).

    Requirements summarized: storage: 2 TB; users: ~15; file sizes: < 100 GB; should be easily reachable (mount as a network drive or at least have solid "communication" software).

    My first question would be: where can I get halfway affordable online storage, and how should I connect it to my server? Getting an additional server is a bit overkill, as I know no hoster which allows 2 TB on a small 2 GHz dual-core 2 GB RAM box (that would be enough by far, I just need a lot of space), and connecting it via NFS or FTP over the internet seems a bit strange and cripples performance. Do you have any advice on where I could get that storage service? (I sent HostEurope a custom request today, but they haven't answered yet. If they can provide me with that space, this question will be irrelevant, but the second one is the more important one anyway; don't do much more than recommend some based on experience, you don't have to crawl for hours through hosting services.) livedrive, for example, offers 5 TB for 17 €/month; I'd be happy with 2 TB for 20 €. The caveat is that it doesn't allow multiple users, which leads me to my second question: where are the security problems? Which protocol is sufficient (I want private and "public" folders etc., the usual "every user has their own plus a public space" thing), secure and fast? I'd tend towards (S)FTP; the problem with FTP is that most of those hosting services don't even allow FTP with multiple users, and a single user leads me into "hacking" a solution (you could map the basic folder structure on the main server and just mount every subfolder from the storage, though things get difficult with a public folder with 644 permissions). Is using something like PKI or 802.1X overkill for private use?

    Read the article

  • Email output of PowerShell script

    - by Gordon Carlisle
    I found this wonderful script that outputs the status of the current DFS backlog to the PowerShell console. This works great, but I need the script to email me so I can schedule it to run nightly. I have tried using the Send-MailMessage command, but can't get it to work, mainly because my PowerShell skills are very weak. I believe most of the issues revolve around the script using the Write-Host command. While the coloring is nice, I would much rather have it email me the results. I also need the solution to be able to specify a mail server, since the DFS servers don't have email capability. Any help or tips are welcome and appreciated. Here is the code:

        $RGroups = Get-WmiObject -Namespace "root\MicrosoftDFS" -Query "SELECT * FROM DfsrReplicationGroupConfig"
        $ComputerName=$env:ComputerName
        $Succ=0
        $Warn=0
        $Err=0
        foreach ($Group in $RGroups)
        {
            $RGFoldersWMIQ = "SELECT * FROM DfsrReplicatedFolderConfig WHERE ReplicationGroupGUID='" + $Group.ReplicationGroupGUID + "'"
            $RGFolders = Get-WmiObject -Namespace "root\MicrosoftDFS" -Query $RGFoldersWMIQ
            $RGConnectionsWMIQ = "SELECT * FROM DfsrConnectionConfig WHERE ReplicationGroupGUID='"+ $Group.ReplicationGroupGUID + "'"
            $RGConnections = Get-WmiObject -Namespace "root\MicrosoftDFS" -Query $RGConnectionsWMIQ
            foreach ($Connection in $RGConnections)
            {
                $ConnectionName = $Connection.PartnerName.Trim()
                if ($Connection.Enabled -eq $True)
                {
                    if (((New-Object System.Net.NetworkInformation.ping).send("$ConnectionName")).Status -eq "Success")
                    {
                        foreach ($Folder in $RGFolders)
                        {
                            $RGName = $Group.ReplicationGroupName
                            $RFName = $Folder.ReplicatedFolderName
                            if ($Connection.Inbound -eq $True)
                            {
                                $SendingMember = $ConnectionName
                                $ReceivingMember = $ComputerName
                                $Direction="inbound"
                            }
                            else
                            {
                                $SendingMember = $ComputerName
                                $ReceivingMember = $ConnectionName
                                $Direction="outbound"
                            }
                            $BLCommand = "dfsrdiag Backlog /RGName:'" + $RGName + "' /RFName:'" + $RFName + "' /SendingMember:" + $SendingMember + " /ReceivingMember:" + $ReceivingMember
                            $Backlog = Invoke-Expression -Command $BLCommand
                            $BackLogFilecount = 0
                            foreach ($item in $Backlog)
                            {
                                if ($item -ilike "*Backlog File count*")
                                {
                                    $BacklogFileCount = [int]$Item.Split(":")[1].Trim()
                                }
                            }
                            if ($BacklogFileCount -eq 0)
                            {
                                $Color="white"
                                $Succ=$Succ+1
                            }
                            elseif ($BacklogFilecount -lt 10)
                            {
                                $Color="yellow"
                                $Warn=$Warn+1
                            }
                            else
                            {
                                $Color="red"
                                $Err=$Err+1
                            }
                            Write-Host "$BacklogFileCount files in backlog $SendingMember->$ReceivingMember for $RGName" -fore $Color
                        } # Closing iterate through all folders
                    } # Closing If replies to ping
                } # Closing If Connection enabled
            } # Closing iteration through all connections
        } # Closing iteration through all groups
        Write-Host "$Succ successful, $Warn warnings and $Err errors from $($Succ+$Warn+$Err) replications."

    Thanks, Gordon
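    A rough sketch of the mailing part (the SMTP server and addresses are placeholders): collect the per-replication lines in a variable instead of writing them with Write-Host, then hand the whole report to Send-MailMessage with an explicit -SmtpServer:

        # accumulate report lines instead of (or in addition to) Write-Host
        $report = @()
        # ...inside the innermost loop, after $BacklogFileCount is known:
        $report += "$BacklogFileCount files in backlog $SendingMember->$ReceivingMember for $RGName"
        # ...after all the loops:
        $report += "$Succ successful, $Warn warnings and $Err errors from $($Succ+$Warn+$Err) replications."
        Send-MailMessage -SmtpServer "mail.example.com" `
            -From "dfsr-report@example.com" -To "admin@example.com" `
            -Subject "DFS backlog report from $env:ComputerName" `
            -Body ($report -join "`r`n")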

    Read the article

  • Writing a script for ash?

    - by rumtscho
    My VPN is behaving funny sometimes, and I have to restart it often. I wanted to write a script which does that for me. It doesn't have to be anything fancy, just a shortcut for the commands I have to type into the terminal. More specifically: it will look at the running processes, and if it finds a running vpnc process, it will kill it. Then it will start vpnc. I've written bash scripts of similar complexity, but now I don't have a bash, only an ash. Until now, the only difference I've noticed is that there are far fewer commands available, but then, I don't use it very often. So I have some questions. Is writing ash scripts different from writing bash scripts? Is there something specific to consider when doing it? When the script is ready, how can I deploy it? For bash, I just put the executable file under /usr/lib and run it by typing the file name into the command line; will this work with ash? Are there any special pitfalls to watch out for in the script I want to write? I think the process-killing part may get hairy if I write something that kills the wrong process, but even then running the script shouldn't break anything permanently, right?
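    A minimal sketch of such a script for a busybox-style ash (the assumption that vpnc reads /etc/vpnc/default.conf when invoked like this may need adjusting for your setup):

        #!/bin/sh
        # restart the VPN: kill any running vpnc, then start a fresh one
        if pidof vpnc >/dev/null 2>&1; then
            kill $(pidof vpnc)
            sleep 2
        fi
        vpnc /etc/vpnc/default.conf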

    Read the article
