Search Results

Search found 17202 results on 689 pages for 'folder permissions'.


  • "The directory name is invalid" when trying to install drivers in Windows 7 via Device Manager

    - by Luke
    First off, this computer is not mine; it's a customer's system. Having said that... The hard drive was moved to a new motherboard, CPU and RAM combo, and it booted up fine. The customer put in the driver CD, the drivers wouldn't load, and he brought it in to me. Under Device Manager on Windows 7 x64 I see lots of PCI-to-PCI bridges, one SMBus Controller, and about 20 Unknown Devices. Greeeeeat... So I start with the SMBus driver, taken directly from the Asus website for the motherboard (P8H77-M Pro). If I install from the setup program, it tells me to reboot, then it starts the install. It gets halfway through the setup, then fails ("An unknown error occurred. Setup will exit"). When I instead point to the folder from Device Manager, it starts copying files for the driver, even presents me with the proper name of the device, but then reports an error there as well: "The directory name is invalid." Some Googling shows that many people had this issue with Vista. OK, Vista and 7 are similar, maybe the solutions are the same... but they aren't. I have tried:

        - copying the entire driver folder and setup utility to the Program Files folder and running it / selecting it in Device Manager
        - downloading another set of drivers in case this one is corrupt
        - disabling UAC
        - deleting and recreating the %WINDIR%\TEMP folder
        - removing all references to previous hardware that I could find, even in Device Manager's hidden mode

    So far nothing has worked, and a wipe and reload is out of the question.

  • Backup with Mercurial and Robocopy?

    - by Andrew Neely
    The Problem: We would like to back up our critical files from several network shares to a removable hard drive. We want to automate the backup so we don't have to remember to run it, and it needs to finish overnight. Furthermore, we want to preserve multiple versions of each file so we can back out of our users' mistakes more easily.

    Background Information: I work in a large Windows-based enterprise with a centralized IT section that is responsible for all backups. Their backups are geared towards disaster recovery rather than user error, and any non-disaster recovery requires upper-level management approval. Several times our backups have failed and we weren't notified. I do not have administrative rights to the server or to my desktop. We are trying to back up some 198,000 files spanning about 240 gigabytes; these files rarely change. Our backup drive is one terabyte.

    My Proposed Solution: What I would like to do is write a batch file using Robocopy with the /MIR option along with Mercurial SCM to store all versions of the files. I would do an hg add followed by an hg commit before each execution of Robocopy to save the current state, and then make a mirrored copy of the file structure. The problem is that /MIR deletes every folder not present in the source, and Mercurial stores its repository in a .hg folder in the destination folder. Does anybody know how I could either convince Mercurial to store the .hg folder elsewhere, or convince Robocopy not to delete it from the destination? I'm trying to avoid writing a custom program to do the copying.
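
    A minimal PowerShell sketch of the proposed approach is below. It assumes that robocopy's /XD exclusion also protects the destination's .hg folder from /MIR's purge (worth verifying on a small test copy first), and the share and drive paths are placeholders.

        # Sketch only: commit the previous mirror state, then re-mirror while
        # excluding the repository metadata so /MIR does not purge it.
        # Assumes the repository was created once beforehand with: hg init <dest>
        $source = "\\server\critical-share"     # placeholder source share
        $dest   = "E:\Backup\critical-share"    # placeholder folder on the 1 TB drive

        # Record the current state of the destination in Mercurial first
        hg --cwd $dest add --quiet
        hg --cwd $dest commit -m ("Nightly snapshot {0}" -f (Get-Date -Format s))

        # Mirror the share, but never copy or delete the .hg folder
        robocopy $source $dest /MIR /XD ".hg" /R:2 /W:5 /LOG:"E:\Backup\robocopy.log"

    If nothing has changed since the last run, the commit simply reports "nothing changed" and the mirror still proceeds.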

  • Config Apache HTTP Server for Eclipse

    - by hqt
    Maybe this question is silly, but I really don't know how to solve it. First, as with any other server, I want to define a new server. So in Eclipse I go to Window > Preferences > Server. 1) When I add a new server, there is no category in the list for Apache HTTP Server, only Apache Tomcat. So I click "Download additional server adapters", but it still isn't in the list. 2) So I search. I point to the location where I have it installed. Good: Eclipse sees that it is an HTTP server, and Eclipse shows me the folder to put projects into (because I use LAMPP, that folder isn't in the Apache folder). But here is my problem: when I want to run a new PHP project, I right-click, Run on Server. A new dialog appears asking me to choose which server to run on, and in the list of servers there is no HTTP server, so I don't know how to choose the Apache HTTP Server (because Eclipse doesn't see the server that I have defined; it only finds the adapters). So if I want to run this project, I must copy everything and paste it into the Apache folder. Far too clumsy! Please help me. Thanks :)

  • PowerShell script to delete subfolders and files if the creation date is >7 days, but keep parent folders of subfolders and files <7 days old

    - by Mark
    I'm currently using the PowerShell script below to delete all files, directories and subdirectories of "$dump_path" that are seven days or older, based on the creation date rather than the modified date. The problem with this script is this: if folder "A" is seven (or more) days old, it will be deleted even if its subfolders and files are less than seven days old.

    What I would like this script to do is this: delete all files from the root and from all subfolders of "$dump_path" that are seven or more days old, but keep the parent folder(s) of any files and folders that are less than seven days old, even if that means the parent folders are themselves more than seven days old. If all subfolders and files are seven days or older, then the parent folder can be deleted. A slightly obscure problem, I know, but the intention is to have a 7-day retention period for all data in a 'sandbox' location of our shared areas. As an added bonus, it would be great if it could generate a log of what it deletes and e-mail it out after the deletion. Thank you for reading and I hope that all makes sense! Mark

        # set folder path
        $dump_path = "c:\temp"
        # set minimum age of files and folders
        $max_days = "-7"
        # get the current date
        $curr_date = Get-Date
        # determine how far back we go based on current date
        $del_date = $curr_date.AddDays($max_days)
        # delete the files and folders
        Get-ChildItem $dump_path | Where-Object { $_.CreationTime -lt $del_date } | Remove-Item -Recurse
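
    One way to get the behaviour described above is to delete only files older than the cutoff, and then prune directories that are left completely empty, deepest first. The sketch below is an untested illustration of that idea; it assumes PowerShell 3.0 or later for the -File/-Directory switches, the log path is a placeholder, and Send-MailMessage could then be used to e-mail the log afterwards.

        # Sketch: remove old files, then prune directories that end up empty,
        # logging everything that gets deleted.
        $dump_path = "C:\temp"
        $del_date  = (Get-Date).AddDays(-7)
        $log       = "C:\logs\sandbox-cleanup.log"   # placeholder log location

        # 1. Delete files older than the cutoff, wherever they sit in the tree
        Get-ChildItem $dump_path -Recurse -File |
            Where-Object { $_.CreationTime -lt $del_date } |
            ForEach-Object {
                Add-Content $log ("Deleting file   {0}" -f $_.FullName)
                Remove-Item $_.FullName -Force
            }

        # 2. Delete old directories that are now completely empty, children first
        Get-ChildItem $dump_path -Recurse -Directory |
            Sort-Object { $_.FullName.Length } -Descending |
            Where-Object { $_.CreationTime -lt $del_date -and -not (Get-ChildItem $_.FullName -Recurse -Force) } |
            ForEach-Object {
                Add-Content $log ("Deleting folder {0}" -f $_.FullName)
                Remove-Item $_.FullName -Force
            }

    Because only old files are removed, any folder that still contains something newer than seven days is never empty and therefore survives, which is the retention behaviour described above.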

  • Password-protected sharing allows access to users who have no account?

    - by romkyns
    Running Windows 7 on two computers in my LAN. Computer A has password-protected sharing enabled and shares a folder. It has a single user account, "Bob", and the Guest account is turned off. The network is workgroup-based. According to the descriptions of password-protected sharing I could find, the only people who can access the shared folder over the LAN are those who know the username and password of the "Bob" account. However, a second computer on the LAN (call it Computer B) is able to view this shared folder by simply browsing to Computer A; it doesn't need to enter any passwords or anything. The only user account registered on that PC is called "Jim", and it has a different password from "Bob". How on earth is Computer B able to view this shared folder? Is the popular description of the password-protected sharing feature inaccurate, or did I misunderstand it big time? P.S. There is a possibility that the password for "Bob" has been entered on that PC once, possibly with the "remember password" box checked, but I've looked in Credential Manager on both computers and there is nothing saved anywhere.

  • SFTP chroot results in broken pipe

    - by Patrick Pruneau
    I have a website to which I want to add some restricted access to a sub-folder. For this, I've decided to use a chroot with SFTP (I mostly followed this guide: http://shapeshed.com/chroot_sftp_users_on_ubuntu_intrepid/). So far I've created a user (sio2104) and a group (magento). After following the guide, my folder listing looks like this:

        -rw-r--r--  1 root root   27 2012-02-01 14:23 index.html
        -rw-r--r--  1 root root   21 2012-02-01 14:24 info.php
        drwx------ 15 root root 4096 2012-02-25 00:31 magento

    As you can see, I've chowned the magento folder, which I wanted to jail the user into, to root:root (and everything else too, by the way). Inside the magento folder I chowned everything to sio2104:magento so they can access what they want. Finally, I added this to the sshd_config file:

        #Subsystem sftp /usr/lib/openssh/sftp-server
        Subsystem sftp internal-sftp
        Match Group magento
            ChrootDirectory /usr/share/nginx/www/magento
            ForceCommand internal-sftp
            AllowTCPForwarding no
            X11Forwarding no
            PasswordAuthentication yes
            #UsePAM yes

    And the result is... well, I can enter my login and password, and it all ends with a "broken pipe" error:

        $ sftp [email protected]
        [....some debug....]
        [email protected]'s password:
        debug1: Authentication succeeded (password).
        Authenticated to 10.20.0.50 ([10.20.0.50]:22).
        debug1: channel 0: new [client-session]
        debug1: Requesting [email protected]
        debug1: Entering interactive session.
        Write failed: Broken pipe
        Connection closed

    Verbose mode gives nothing to help. Does anyone have an idea of what I've done wrong? If I log in with ssh or sftp as my personal user, everything works fine.

  • Mac OS X Lion (10.7) Drive Encryption

    - by Skoota
    My iMac has two drives (a 256 GB solid-state drive and a regular 2 TB hard drive). The Mac OS X Lion system is installed on the solid-state drive and, like many other users, I have moved my user profile folder onto the secondary 2 TB drive. However, as you may be aware, FileVault 2 on Mac OS X Lion (10.7) only encrypts the system drive, which leaves my data drive (containing my user profile folder, with all of my data) unencrypted. I am aware that workarounds for this issue exist (such as https://github.com/jridgewell/Unlock), but I am not happy with the results, since they decrypt the data drive at startup using a LaunchDaemon (before any user has logged into the computer), essentially meaning that any user who logs onto the computer will see the unencrypted drive. I would like a method that only decrypts the data when an authorised user logs into the computer. As such, is there a way to do one of the following?

        1. Encrypt the entire data drive and only decrypt it when an authorised user logs into the computer. This would be equivalent to the Lion FileVault 2 behaviour, but on a secondary drive rather than the system drive.
        2. Encrypt only the user profile folder on the data drive, and only decrypt that folder when the user logs into the computer. This would be equivalent to the behaviour of FileVault 1 on previous versions of Mac OS X.

    I am happy to pay for a commercial third-party product that provides the required feature(s), but I have not yet been able to find one. Thanks in advance for any assistance.

  • How to sync two computers using new MobileMe calendar

    - by CesarGon
    I have been using MobileMe for over a year with success. I use it to sync the Outlook calendars on my work and home computers, both running Windows 7 and Outlook 2007. The main Outlook calendar folder on my work computer is replicated to MobileMe as "Work" and synced to my home computer, and the main calendar folder on my home computer is replicated to MobileMe as "Home" and synced to my work computer. This means that I can see both the "Work" and "Home" calendars from both computers (as well as from the web interface through me.com), which is very convenient.

    Yesterday I migrated to the new MobileMe calendar, accepting the suggestion that popped up on the me.com website. After the migration, the MobileMe control panel on each of the Windows computers asked me to re-configure my calendar setup, and everything fell apart. The "Home" and "Work" calendar folders in Outlook are now ignored by MobileMe, and new ones named "Home in MobileMe" and "Work in MobileMe" have been created and placed in a separate Outlook data file rather than the default. This means that now:

        - I have four folders, two of which are not replicated to MobileMe.
        - The two folders that are not replicated reside in a separate data file, so alarms and reminders don't work; they're basically useless to me as calendar folders.
        - In addition, the button in the MobileMe Control Panel that used to let me specify which MobileMe folder should be synced against the default Outlook folder has gone. MobileMe is now too smart.

    Do you have any idea how to undo this mess and go back to a situation where I have two folders, as described in the first paragraph, which stay synced? I don't want an extra data file. Thanks.

  • Syncing Exchange Inbox subfolders to Blackberry

    - by Andrew W.
    Hello - I have a Blackberry user who has organized his Outlook mail by manually sorting messages into specific subfolders. For example:

        |--> Inbox
        |----> Mail from 2004 (subfolder of Inbox)
        |--------> Business (subfolder of Mail from 2004)
        |--> Personal
        |--> Travel
        |--> Mail from 2003

    When using the Blackberry, this user is able to view a folder, however the folder appears empty, despite knowing the folder has mail messages in it on the Exchange server. The handheld appears to sync without issue and the user receives new mail messages. I have Folder Redirection enabled and all folders check-marked. I am using BES 4.0. So I guess my question is this: how are the Outlook subfolders synced with the handheld? Additionally, if a mail message on the handheld is moved into a subfolder from the handheld, will it be synced on the Exchange server? Thanks in advance!

  • Managing multiple IMAP accounts in Thunderbird

    - by baritoneuk
    I've been using Thunderbird for years without issues with 20+ POP3 accounts. I'm moving over to IMAP, which will enable me to keep copies of the emails locally and on the server whilst keeping everything synchronised. However, I'm looking for the best way to manage multiple IMAP accounts in Thunderbird.

    Currently I have a filter that copies all the emails into a central inbox and into separate local folders. The reason for this is that I go through my inbox daily and delete all emails that don't require any action, and I move any emails that do require action to my "action" IMAP account folder. This way I can synchronise all the emails that require action across multiple computers (and mobile devices). This technique is my implementation of the GTD, or Getting Things Done, philosophy. I also copy each email into separate local folders, just in case any emails on the IMAP accounts get deleted, or something drastic happens on the server and I lose all the emails. My business partner has access to some of these emails and still uses POP3 (with "leave copy on server" checked), but I know Thunderbird can still sometimes delete emails off the server.

    The problem with the above is that Thunderbird gives me the dreaded error dialogue saying that the emails cannot be filtered due to another process. I also find the folder list in Thunderbird hard to manage: it has become a complicated list that is not easy to navigate. (The original question included a screenshot of part of the folder list.)

    What would be the best way of managing multiple IMAP accounts whilst allowing me to have copies put in a central folder and emails in local folders? It would also be useful to know whether people think this is even necessary, as perhaps there is a better way. How do people manage multiple IMAP accounts in a way that allows them to keep on top of actionable emails? I'd be interested in how others manage this. I've never used the Thunderbird-based client "Postbox"; does it handle multiple IMAP accounts better?

  • Google Apps Sync bloated PST file to 14GB

    - by James S
    Back story: I have Outlook connected to my Google Apps email and noticed that some mail never got migrated from my original PST file. I found some VBA code online that compares mail in different PST folders, modified it to find missing messages and copy them to the target folder, then ran it folder by folder and moved the missing mail. Before the exercise the Google Apps PST was about 4 GB; afterwards it was about 4.7 GB.

    Problem: I left Outlook open so Google Apps Sync could copy the mail online. 24 hours later the Google Apps PST file has bloated to over 14 GB and none of the mail has been synced to the cloud. I know there should be at most about 5 GB of mail, so why is the rest of the space being taken up? The funny thing is that Gmail shows 3 GB as being used online.

    What I tried:

        - I emptied the deleted items folder and the recycling bin.
        - I ran Outlook's compact PST and it didn't work.
        - I tried SCANPST.exe on the PST and it didn't work.
        - I re-ran compact PST and it didn't work (after SCANPST found and fixed a few errors).

    Any ideas out there on what caused the problem and how to solve it?

  • Cannot find network path for computer in workgroup of home Windows XP PCs

    - by John Galt
    VMware Workstation 6.5 is running as an app on a Windows Vista 64-bit PC host. Thanks to Workstation we have two guest machines running: TerriVM and MattVM (both run Windows XP SP2). We are attempting to get virtual networking configured so we can access the files of both of these VM guests from other real PCs connected to this home network. We think we are close but we can't quite get it right... Here is what we've done so far:

        - In VMware Workstation, we set "Host Virtual Network Mapping" to use VMnet0 with the setting "Bridge to an automatically chosen adapter".
        - On each VM guest (i.e. using Windows Explorer on XP), we right-click the C: disk, click the "Sharing" tab, set the share name to "C_Disk" and check both boxes labeled "Share this folder on the network" and "Allow network users to change my files".

    Symptoms: On the "JohnsRealXP" PC, we go to Windows Explorer, My Computer, Map Network Drive, type \\TerriVM\C_Disk into the Folder textbox and assign drive letter T. We see all the folders on this shared drive and can open files on them, so that is good. On the same "JohnsRealXP" PC, we go to Windows Explorer, My Computer, Map Network Drive, type \\MattVM\C_Disk into the Folder textbox and assign drive letter M. We get a message box: "The network path \\mattvm\C_Disk could not be found". Alternatively, we type just \\mattvm\ into the Folder box and click "Browse", and get a dialog box where we drill down from "Entire Network" to "Microsoft Windows Network" to "Workgroup", where both TerriVM and MattVM are listed as computers on the network. Clicking the + sign next to MattVM gives an hourglass, never enables the OK button, and I have to cancel.

    In summary, I think we've attempted to share both of these virtual machines using the same technique and connect to them in a similar fashion, but one connects properly and the other machine can be seen but none of its shared resources can be accessed. Can anyone suggest something possibly overlooked or something to try? Thanks so much in advance.

  • What is the easiest no-fail way to publish an ASP.NET app?

    - by Maestro1024
    What is the easiest no-fail way to publish an ASP.NET app? Sorry, a bit of an open-ended question, but I am having issues deploying an ASP.NET report project and any solution that gets the site up is fine. I am running Windows 7 and SQL Server 2008 and want to publish an ASP.NET report site that I created in VS 2008. The website launches when I run it in debug in Visual Studio, but I want to publish the site so that it can be seen on the LAN. I published the files off to a folder, started up IIS Manager, added a new site and pointed it to that folder, and set the permissions on the folder to share to Everyone. However, when I go to the DNS name I put in for the website, it does not launch. Any ideas on this? I see websites out there talking about a Web Sharing tab on the folder properties, but I do not see that when I go to folders. Why might that be? Another avenue I have not pursued yet is publishing directly to a website. Has anyone tried that? Is that better or worse than publishing to the filesystem?
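
    For reference, here is a rough PowerShell sketch of the "add a new site in IIS Manager and point it at the published folder" step described above. It assumes IIS and its WebAdministration module are installed; the site name, host header and path are placeholders.

        # Sketch only: create an IIS site over the published folder and give the
        # IIS worker processes read access to it.
        Import-Module WebAdministration
        New-Website -Name "ReportSite" -Port 80 -HostHeader "reports.local" `
                    -PhysicalPath "C:\Published\ReportSite"
        icacls "C:\Published\ReportSite" /grant "IIS_IUSRS:(OI)(CI)RX" /T

    Whatever DNS name is used also has to resolve to this machine on the LAN (or be bound as the host header above), otherwise the site will not be reachable even if it is running.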

  • How to prevent other computers from seeing our network computers through VPN

    - by Disco
    We have a local office domain consisting of Windows 7 and XP machines that is running on Windows Server 2008 R2. We also have users that connect via VPN into our network. My concern is that when a remote user opens up a folder, the Network section on the left side of the folder shows the remote user all the computer names in our local network. I would like to go about renaming our computers in the local network with more descriptive computer names, but I do not want the users off-site to be able to see these computer names by simply opening up a folder. (Granted, they can already do this, but our current naming scheme does not link computer names to users.) I would like to change our computer names so we can determine which computer belongs to which user more easily IF it can be done securely. How can I ensure that our local computer names are not showing up in the Network folder for remote, VPN-connected users? My online searches have turned up results where people are advised to turn off Network Sharing and Discovery, but that seems to only ensure that the local machine doesn't see other computer names. I want to prevent OUR computer names from showing up on OTHER computers, and I can't go into the VPN-connected computers and turn off THEIR Network Discovery settings. I would think there is a group policy that would control this but I have not found one yet and I don't know how I would apply it to VPN-connected computers. Thanks! EDIT: That's true, a Group Policy wouldn't run on users only connecting via VPN, good point. What about a VPN/router policy, then?

  • Maintaining "Portability" Between Linux and Windows 7

    - by lokheart
    I use the following approach on my office's Windows 7 machine to maintain my "portability" for when disaster strikes and I need to switch computers without the luxury of time to reinstall all my programs on the new PC:

        - The majority of programs I use are portable, mostly from portableapps.com (like notepad+, GIMP, even R). I extract them and store them in a folder in My Documents, in a structure similar to the default PortableApps installation when they are installed to a thumb drive.
        - Only for a few pieces of software is no portable version available, and those I install as usual.
        - All of my working files are stored in a folder in My Documents.
        - I regularly back everything up using SyncBack, because this program can keep versions of my backups, and the backup is stored on a portable drive.

    If one day I need to switch computers, the operation is relatively simple for me: I just move the two folders mentioned above into the My Documents folder of the new PC, install those few "non-portable" programs on it, and it is almost done; some minor hiccups can be solved by reinstalling the portable apps onto the drive. Overall it is a smooth process.

    I would like to maintain the same degree of "portability" on my home Linux desktop (Ubuntu or Mint, I'm still deciding). That is, if my Linux install crashes and I need to reinstall it, all I need to do is move the two folders back into the new Linux install and most of my work will be almost ready to be worked on again. But I don't know how to find a Linux alternative to PortableApps. Being new to Linux, can anyone tell me whether this is possible?

  • Hiding subfolders from users with Windows Server security

    - by Frans
    Using Windows Server 2008, I would like to allow all users to map to a common network drive and be able to browse it, but I only want them to be able to see the subfolders they actually have access rights to. Is this doable?

    Example: I have a share with two folders on it,

        \\domain\share\FolderA
        \\domain\share\FolderB

    and three different security groups. I would like to map a network drive for all three groups to \\domain\share. However, group1 should only be able to see FolderA, group2 should only see FolderB, and group3 should see both. I am not just talking about denying access to the actual folders, which is easy enough; I don't want a user to even be able to see that the folder exists. In other words, when group1 logs in and does "dir N:\", they should see only

        N:\FolderA

    when group2 logs in, they should see

        N:\FolderB

    and when group3 logs in, they should see

        N:\FolderA
        N:\FolderB

    My half-baked solution: If I completely block access to the root, then I can't map a drive to it. I can give everyone the traverse right, which then allows the user to map a drive. However, if a member of group1 or group2 tries to go to "N:\" they get an access-denied error; if they go to N:\FolderA (for group1) it works. So that sort of works, but it would be nicer if the user could actually browse to N:\ and only see the subfolders they have access to. I am pretty sure I have seen this done but not sure how to do it myself. Any advice would be greatly appreciated.
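
    For what it's worth, a minimal sketch of the "traverse right on the root" workaround described above might look like the following, run from an elevated prompt on the file server. The local path, the group names and the Modify right on the subfolders are assumptions, not part of the original setup.

        # Sketch of the half-baked approach: traverse-only on the root so drives
        # can be mapped, with real access granted only on the subfolders.
        icacls "D:\Share" /grant "Everyone:(X)"
        icacls "D:\Share\FolderA" /grant "group1:(OI)(CI)M"
        icacls "D:\Share\FolderA" /grant "group3:(OI)(CI)M"
        icacls "D:\Share\FolderB" /grant "group2:(OI)(CI)M"
        icacls "D:\Share\FolderB" /grant "group3:(OI)(CI)M"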

  • New Tomcat install on OS X choking on startup

    - by baudot
    I've completed a fresh install of Tomcat 6 on an OS X box that didn't have it before. It has behaved a bit strangely in other ways, but the current hang-up is that it won't start at all. In response to running startup.sh, the catalina.out log collects this error:

        Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/catalina/startup/Bootstrap
        Caused by: java.lang.ClassNotFoundException: org.apache.catalina.startup.Bootstrap
            at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
            at java.security.AccessController.doPrivileged(Native Method)
            at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
            at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:248)

    Other bits of strangeness noticed with this installation: the .sh scripts in the bin directory had no execute permission and had to be chmodded manually, and the log folder wasn't created, causing an earlier script crash. After I manually created the log folder, the startup script made it to this new error before failing. Running other scripts in the bin folder generates similar NoClassDefFoundError messages. Bootstrap.java is indeed in the right place, though Bootstrap.class isn't in the same folder. For that matter, if any of the myriad class files for Tomcat should have already been generated from their .java files, I haven't seen it.

  • How to move your Windows User Profile to another drive in Windows 8

    - by Mark
    I like to have my user folder on a different drive (D:) than the OS (C:). Reading the following post, I decided to give it a try. All went quite well, until I found out that my Windows 8 apps won't launch any more (other than that I didn't notice any problems). The apps do work under an account that hasn't been moved. In the Event Viewer I've found error messages like these:

        App <Microsoft.MicrosoftSkyDrive> crashed with an unhandled Javascript exception. App details are as follows:
        Display Name: <SkyDrive>
        AppUserModelId: <microsoft.microsoftskydrive_8wekyb3d8bbwe!Microsoft.MicrosoftSkyDrive>
        Package Identity: <microsoft.microsoftskydrive_16.4.4204.712_x64__8wekyb3d8bbwe>
        PID: <4452>
        The details of the JavaScript exception are as follows:
        Exception Name: <WinRT error>
        Description: <Loading the state store failed.>
        HTML Document Path: </modernskydrive/product/skydrive/App.html>
        Source File Name: <ms-appx://microsoft.microsoftskydrive/jx/jx.js>
        Source Line Number: <1>
        Source Column Number: <27246>
        Stack Trace:
            ms-appx://microsoft.microsoftskydrive/jx/jx.js:1:27246 localSettings()
            ms-appx://microsoft.microsoftskydrive/jx/jx.js:1:51544 _initSettings()
            ms-appx://microsoft.microsoftskydrive/jx/jx.js:1:54710 getApplicationStatus(boolean)
            ms-appx://microsoft.microsoftskydrive/jx/jx.js:1:48180 init(object)
            ms-appx://microsoft.microsoftskydrive/jx/jx.js:1:45583 Application(number, boolean)
            ms-appx://microsoft.microsoftskydrive/modernskydrive/product/skydrive/App.html:216:13 Anonymous function(object)

    Using ProcMon, I see a lot of ACCESS DENIED messages, like this one:

        Date & Time:    12-9-2012 9:32:20
        Event Class:    File System
        Operation:      CreateFile
        Result:         ACCESS DENIED
        Path:           D:\Users\John\AppData\Local\Packages\microsoft.microsoftskydrive_8wekyb3d8bbwe\Settings\settings.dat
        TID:            2520
        Duration:       0.0000149
        Desired Access: Read Data/List Directory, Write Data/Add File, Read Control
        Disposition:    OpenIf
        Options:        Sequential Access, Synchronous IO Non-Alert, No Compression
        Attributes:     N
        ShareMode:      None
        AllocationSize: 0

    Any idea how to solve this? I noticed that the app folders, e.g. D:\Users\john\AppData\Local\Packages\microsoft.microsoftskydrive_8wekyb3d8bbwe, had a different owner than the old profile folder had: the old profile folder had john as owner, whereas the new profile folder had the Administrators group as owner. Changing this didn't help, unfortunately.
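
    A speculative note, not a verified fix: Store apps run under the "ALL APPLICATION PACKAGES" identity, and the ACCESS DENIED on settings.dat above suggests that group may have lost its permissions on the moved Packages folder. A sketch of re-granting it (the path is taken from the ProcMon output; the Modify right is an assumption) would be:

        # Hypothetical repair step, run from an elevated prompt: re-grant the
        # app-container group access to the moved profile's package folders.
        icacls "D:\Users\John\AppData\Local\Packages" /grant "ALL APPLICATION PACKAGES:(OI)(CI)M" /T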

  • Why is the Windows 8 recycle bin using more space than it is allocated?

    - by oldmankit
    I ran WinDirStat to scan the contents of my hard drive and was surprised to see that the $RECYCLE.BIN folder on my D: drive takes 26 GB of space.

        - I emptied the Recycle Bin and refreshed the folder in WinDirStat, but it still takes 26 GB of space.
        - I reduced the maximum size of the Recycle Bin for this drive to 10000 MB for the main user of this computer, disabled the Recycle Bin for the other user, and refreshed the folder in WinDirStat, but it still takes 26 GB of space.
        - I ran (in an elevated window) rd /s D:\$Recycle.bin, refreshed the folder in WinDirStat, and finally it became empty.

    Why was it taking up space even after I emptied it? Why was it taking more space (26 GB) than the maximum allowed amount (10 GB)?

    Update: After six months of using Windows (no re-install and no changes to settings related to the Recycle Bin), I used WinDirStat to check how big D:\$RECYCLE.BIN has become. It is now 29 GB. In Recycle Bin Properties I select drive D, and it still shows a custom maximum size of 10000 MB.
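
    Each user's deleted files live in a per-SID subfolder of the hidden $RECYCLE.BIN, and the configured maximum applies per user and per drive, so one way to see where the space actually sits is to measure those subfolders directly. A rough sketch (assumes PowerShell 3.0 or later and an elevated prompt):

        # Sketch: report how much space each SID subfolder of the recycle bin uses.
        Get-ChildItem 'D:\$RECYCLE.BIN' -Force -Directory | ForEach-Object {
            $bytes = (Get-ChildItem $_.FullName -Recurse -Force -File |
                      Measure-Object -Property Length -Sum).Sum
            '{0,15:N0} bytes   {1}' -f $bytes, $_.Name
        }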

  • Not able to access external Hard disk

    - by Jash Jacob
    I have a 1 TB external hard drive which I'm currently not able to access. When I open the external drive in Finder, it shows as empty, yet when I use "Get Info" I get a dialog box stating it has about 300 GB free. I tried to get into the external drive using Terminal and had no luck. Checking in Disk Utility, it showed that I have a large number of files but zero folders. I tried to repair the disk, and during the process the external drive got unmounted. I then checked the drive on Windows: I was able to open almost all the folders, but I wasn't able to copy anything onto the external drive, and one folder caused my Windows computer to hang. So I connected the drive back to my MacBook Pro and tried to access it through Terminal (this time it worked!), and when I tried to delete that folder with the rm command, I got an "input/output error". What should I do to recover the files in that folder? How can I access my external drive on my Mac?

  • Windows 7 search does not return results from indexed folders

    - by Dilbert
    I am experiencing this issue over and over again and I just cannot seem to find the answer. It doesn't make sense, but search simply does not return results from folders that certainly contain the files. It's weird that this technology has existed for more than 5 years now (it could be added to Windows XP as an add-on), and they still haven't got it right. My folder contains 10 image files with .png extensions. Two scenarios:

        Scenario 1: I exclude the folder using Indexing Options. Search works.
        Scenario 2: I turn on indexing for this folder. Search does not work.

    Of course, Agent Ransack returns results every time. When I check Advanced Options for the indexing settings in Control Panel, .png files are checked in the File Types tab, using the "File Properties filter". What's the deal with this?

    [Edit] To clarify, this doesn't happen with all folders, but it does with more than one. For the "problematic" folders, even *.* doesn't return a single result. I found some advice to clear the archive and read-only attributes for all files (doesn't make sense, but hey), but it didn't work. The indexing status in Control Panel is "Indexing complete. 100,000 items indexed." The folder is included in the list, and the file types list contains the .png extension (although it doesn't work with any filter, not even *.*).
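
    One way to narrow down whether the files are missing from the index itself, or only from Explorer's view of it, is to query the Windows Search index directly through its Search.CollatorDSO OLE DB provider. A rough PowerShell sketch follows; the folder path is a placeholder.

        # Sketch: ask the Windows Search index what it has under a given folder.
        Add-Type -AssemblyName System.Data
        $connString = "Provider=Search.CollatorDSO;Extended Properties='Application=Windows';"
        $sql        = "SELECT System.ItemPathDisplay FROM SystemIndex WHERE SCOPE='file:C:/Some/Problem/Folder'"   # placeholder path
        $conn   = New-Object -TypeName System.Data.OleDb.OleDbConnection -ArgumentList $connString
        $conn.Open()
        $cmd    = New-Object -TypeName System.Data.OleDb.OleDbCommand -ArgumentList $sql, $conn
        $reader = $cmd.ExecuteReader()
        while ($reader.Read()) { $reader.GetString(0) }
        $reader.Close()
        $conn.Close()

    If the query lists the .png files but Explorer search does not, the problem is in the search UI or its filters rather than in the index.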

  • How do I fix error 1303 during TI Connect install?

    - by smoth190
    I recently purchased a TI-84 Plus graphing calculator, and I'm trying to install the TI Connect software in order to connect the calculator to my computer via the USB cable. Unfortunately, I'm getting this error while trying to install the program:

        Error 1303. The installation has insufficient privileges to access this directory: E:\Data\Timothy\Documents\MyTIData. The installation cannot continue. Log on as administrator or contact your system administrator.

    However, my account is the only account on my PC, and it has administrative privileges. I've also tried running the installer with Run as Administrator, but with no luck. If I create the folder MyTIData manually, I receive this error:

        Error 1317. An error occurred while attempting to create the directory: E:\Data\Timothy\Documents\MyTIData

    I've reapplied the security settings on the E:\Data folder (and all its sub-directories) to Full for my account. I've also gone into Computer Management and given SYSTEM full privileges for the entire disk. I've also logged out, logged back in, restarted, etc., but still no luck.

    Now, I should mention that my Documents folder is not in the default location. I changed it because my C: disk is a 90 GB SSD, so I moved all my personal data onto the extra storage disk (which is ~1 TB). I don't know if that is causing the issue, but it can't hurt throwing it out there.

    So why can't I install this program? Googling the problem brings up this error for various other installers (such as Visual Studio and Microsoft Office), but nothing for TI Connect. All the solutions are the same: give the folder Full privileges... but I've already done this! I've also tried running the installer with and without the calculator plugged in, but it didn't change anything. In the prompt that contains the error, repeatedly clicking Retry, or waiting a few moments before clicking Retry, also produces no result.
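
    For reference, a command-line sketch of re-granting Full control as described above, run from an elevated prompt. The takeown step is an extra assumption beyond what was described, and the account name Timothy is only inferred from the path, so both may need adjusting.

        # Hypothetical commands reproducing the permissions reset described above.
        takeown /F "E:\Data" /R /D Y
        icacls "E:\Data" /grant "Timothy:(OI)(CI)F" /T
        icacls "E:\Data" /grant "SYSTEM:(OI)(CI)F" /T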

  • How to avoid sshfs freezing?

    - by Andreas Hagen
    So the issue is this: I've installed sshfs on Ubuntu 12.04 and I'm trying to connect to a couple of remote servers. Initially the mount seems successful; sometimes GNOME even picks it up and displays the "new device found" box at the bottom of the screen. But from here on, not much works, or at least not any more. The first couple of times I connected it seemed to work fine and I was able to transfer some files, then I disconnected using fusermount -u <folder>, and after reconnecting a little later the trouble started.

    Now, after executing

        sshfs -o ServerAliveInterval=15 -o reconnect -C -o workaround=all -o idmap=user root@<host>:/ <folder>

    the shell just freezes when I change directory into the mount point. Strangely, ls -al <folder> works when listing just the root of the remote system, but nothing more. Also, every file explorer I've tried freezes, just like cd <folder>. To me it seemed like there was some kind of zombie thread or something hanging around my system, given that it did work the first time, so I have tried rebooting, but no luck.

    sshfs -V gives this:

        SSHFS version 2.3
        FUSE library version: 2.8.6
        fusermount version: 2.8.6
        using FUSE kernel interface version 7.12

    So yeah, any ideas?

  • Shortcut to "Printers and Faxes" on another computer

    - by Doltknuckle
    I have a print server running Windows Server 2008 that has about 50 printers on it. In Windows XP, I was able to connect to the server using the UNC name and make a shortcut to the "Printers and Faxes" folder. (For the record, I know that it isn't really a folder, but that's outside the scope of this question.) I have recently switched to Windows 7 and I find that the jump lists are really useful. One of the things I want to do is make it easy to get to that server's "Printers and Faxes" folder; I would like something like a shortcut that I can open and go immediately to that location. The problem is that Windows 7 doesn't have a way to create such a shortcut like you could in Windows XP. It has a button on the toolbar that says "View remote printers" which sends you to the correct folder, but I'd like to avoid having to type out the server name. I also can't use the "View network" link in Windows Explorer; our organization has over 6,000 machines and viewing the network lists all of them. This is all about saving time by using the minimum number of mouse clicks and key presses in normal operation. Does anyone have any suggestions?

  • Good process/software for organizing photos past/present

    - by Matthew
    So I have tons of photos, taken all the time, and a lot from years past that I never went through (meaning deleting duplicates, etc.). I've got a new PC with Windows 7, and I'm wondering what a good process is to organize those photos. They're in folders that really have no meaning (people used to put them in a folder anywhere, even on the desktop or somewhere else, not just the My Pictures folder). I'm going to keep all pictures in the "My Pictures" folder from now on.

    I've used Picasa from Google, and it works great. Is this the recommended free software for this? What process do I use to move the old pictures over into new "organized" folders? Lately in Picasa, when I import off my camera card, I just select an option that names the folder after the date the photos were taken. Is this advised? Just give me ideas on how to stay organized with photos. Should I also tag them? Should I rename the files? Keep in mind I have over 16,000 photos to go through, so it can't be anything too thorough.
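
    If the date-named-folder approach mentioned above appeals, the same idea can be applied to the old, unsorted photos. A rough PowerShell sketch (PowerShell 3.0 or later): the source folder is a placeholder, and it uses each file's last-write time rather than the EXIF date, which is only an approximation.

        # Sketch: file loose pictures into My Pictures\yyyy-MM folders by date.
        $pictures = [Environment]::GetFolderPath('MyPictures')
        Get-ChildItem 'C:\OldPhotoDump' -Recurse -File -Include *.jpg, *.png | ForEach-Object {
            $target = Join-Path $pictures $_.LastWriteTime.ToString('yyyy-MM')
            if (-not (Test-Path $target)) { New-Item $target -ItemType Directory | Out-Null }
            Move-Item $_.FullName -Destination $target
        }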
