Search Results

Search found 39577 results on 1584 pages for 'temp files'.

  • "An error has occurred" when opening xlsx files on Sharepoint 2010

    - by Mike Messina
    We just recently installed, and have been using, SharePoint 2010, and have run into a bit of a problem opening Excel spreadsheets with the xlsx extension. We are able to upload a spreadsheet with the xlsx extension; however, when we attempt to download the same spreadsheet we get the following error from Excel: "An error has occurred. Please try again." We can open xls files fine, as well as docx and pptx files.

  • Windows - How to remotely watch log files

    - by weismat
    I would like to look at some log files solely via the console on a standard Windows 7 machine. The logs are created by scheduled tasks, and I find it a hassle to use VNC for this purpose. What technology should I look at? PowerShell, Cygwin via ssh, or something else? The log files are written using log4net, so there might also be an easy way to reconfigure it to create events or something else for remote display.
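
    As an aside, the "Cygwin via ssh" option boils down to a remote tail. A minimal sketch, assuming an sshd is already running on the Windows 7 machine (e.g. Cygwin's openssh package) and using a placeholder host name and log path:

        # follow the last 50 lines of the log over ssh; adjust host and path
        ssh user@win7-host "tail -n 50 -f /cygdrive/c/logs/mytask.log"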

  • serving files via subdomain

    - by muntasir
    Does serving files like images via a subdomain speed up page loading? I read somewhere that serving static files such as images and JavaScript from a cookieless domain makes some difference to website speed.

  • Apache returning text/html on some png files

    - by Oren
    I have an Apache web server that has a subfolder for images. For some reason, a few of the .png files are returned as text/html instead of image/png. There is nothing indicating a permission problem, and the files come back with status code 200 and at full size. I made sure the image/png MIME type is set and even tried forcing it with .htaccess. Any idea where to look next? Edit: it looks like an .htaccess configuration problem in a parent directory.

  • Why Can't I Pre-Zip Server Files?

    - by ThinkBohemian
    It's just good common sense to have your server gzip files before sending them to users (I use Nginx). Is there any way to save the server some overhead and pre-zip those files for it, and if not, why? For instance, rather than giving the server a myscript.js and having the server zip the file and send it to the user, is there a way to create myscript.js.zip so the server doesn't have to?
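
    For what it's worth, Nginx's stock ngx_http_gzip_static_module does exactly this, except it looks for a .gz file next to the original rather than a .zip: with gzip_static on; in the relevant location, a request for myscript.js can be answered from myscript.js.gz without compressing anything at request time. A sketch for pre-compressing a static directory (the paths are only placeholders):

        # keep the originals for clients that don't accept gzip; write .gz copies
        # alongside them for nginx's gzip_static module to pick up
        for f in /var/www/static/*.js /var/www/static/*.css; do
            gzip -9 -c "$f" > "$f.gz"
        done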

  • Mass deleting files in Windows

    - by aaginor
    I have a directory that contains ~3 million files in various subdirectories on a Windows 2008 server. Manually deleting the files via SHIFT+DEL on the root directory takes ages. Is there any other way to do the deletion in a faster manner?
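
    One note: Explorer's SHIFT+DEL is slow largely because it enumerates everything up front; a plain command-prompt del /f /s /q followed by rd /s /q on the directory is usually far quicker. In shell form, a sketch that assumes a Unix-style shell such as Cygwin happens to be installed on the server (the path is only a placeholder):

        # recursively remove the whole tree without Explorer's per-file overhead
        rm -rf /cygdrive/d/data/old-files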

  • Delete duplicate files with a Windows batch file?

    - by Chris Sobolewski
    I have a program that automatically copies files to a directory, and if it creates a duplicate it names it like so:

        file with duplicate.xxx
        file with duplicate - 1.xxx
        file with duplicate - 2.xxx

    I need a way to delete all the duplicates with a Windows batch file. Something along the lines of:

        FOR %f IN (C:\files\*.*) DO del "%f - 1"

    However, that will not work, because it would resolve to "file with duplicate - 1.xxx - 1".

  • How to protect files/folders from being copied/moved/deleted/cut on windows

    - by Sean Lee
    I need to share data on an external drive that will be handed over to someone else, and I would like to achieve the following:

        (1) protect all the files and folders from being copied/moved/deleted/cut on a Windows system;
        (2) files are browsable and media is playable, but everything stays inside the drive;
        (3) the same behavior if the drive is plugged into a Linux system, or the drive simply being inaccessible there is fine too.

    How can I do this without using paid software?

  • Diff 2 files while ignoring parts of lines

    - by Millianz
    I would like to diff a file system. Currently my bash script prints the file system recursively into a file (ls -l -R) and diffs it against an expected output. An example line in this file would be:

        drw---- 100000f3 00000400 0 ./foo/

    My current diff command is:

        diff "$TEMP_LOG" "$DIFF_FILE_OUT" --strip-trailing-cr --changed-group-format='%' --unchanged-group-format='' "$SubLog"

    As you can see, I ignore additional lines in the current output file; I only care about lines that match the master output. The problem is that some files may differ in size, or a folder might even have a different name, but due to its location I know what access rights it should have. For example:

        Output: ------- 00000000 00000000 528 ./foo/bar.txt
        Master: ------- 00000000 00000000 200 ./foo/bar.txt

    Only the size differs here, and it doesn't matter. I would like to just ignore certain parts of the diff, kind of like an ANSI C comment:

        Master: ------- 00000000 00000000 /*200*/ ./foo/bar.txt

    -- OR --

        Master: d------ 00000000 00000000 /*10*/ ./foo//*123123*///*76456546*//bar.txt
        Output: d------ 00000000 00000000 0 ./foo/asd/sdf/bar.txt

    ...and still have it diff correctly. Is this even possible with diff, or will I have to write a custom script for it? Since I'm fairly new to Cygwin I might be using the completely wrong tool altogether; I'm happy for any suggestions.

    Update: Taking a step back, here is the general task I want to achieve. I want to write a script that checks the file system to see whether the read/write permissions are set up correctly. The structure of the file system is under my control, so I don't have to worry about it changing too much. Sometimes folders/files might not be present, but if they are, their permissions must be checked. For example, assume that the following is a snapshot of the current file system structure:

        drw ./foo
        drw ./foo/bar
        -rw ./foo/bar/bar.txt
        drw ./foo/baz
        -rw ./foo/baz/baz.txt

    And this is what the file system structure might dictate, i.e. if these folders/files are present, their permissions must match:

        drw ./foo
        drw ./foo/bar
        -rw ./foo/bar/bar.txt
        --- ./foo/bar/foobar.txt
        drw ./foo/baz
        -rw ./foo/baz/foobaz.txt

    In this case the file system checks out OK, since all files present match their expected values. The situation becomes more complicated as soon as certain folders can have an arbitrary name; only their location tells me what their permissions should be. Assume that the directory ./foo/bar in the above example is such a case, i.e. instead of bar the folder could have any name, but it would still have to match the -rw permissions. This seems like a very complicated situation, and I'm not even sure I can solve it with bash scripting alone. I might have to write an actual application.
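
    A common way around this is to normalise both listings before diffing, rather than asking diff itself to ignore fields. A minimal sketch, assuming (from the sample lines) that the size is always the fourth whitespace-separated field:

        # blank out the size column in both listings, then diff the results
        awk '{ $4 = "-"; print }' "$DIFF_FILE_OUT" > /tmp/master.normalized
        awk '{ $4 = "-"; print }' "$TEMP_LOG"      > /tmp/current.normalized
        diff --strip-trailing-cr /tmp/master.normalized /tmp/current.normalized

    Directory names that are allowed to vary would need the same kind of rewriting (replacing the variable path components with a fixed token on both sides) before the comparison.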

  • rsync --files-from (find + cat)

    - by Edward
    I try the command:

        rsync -v --files-from=/path/to/list.lst /home/user /path/to/backup

    list.lst contains, for example:

        .gnupg/
        .pki/
        .gnome2/keyrings/
        .mozilla/firefox/*.default/bookmarkbackups/
        .mozilla/firefox/*.default/bookmarks.html
        .mozilla/firefox/*.default/*.db files
        .mozilla/firefox/*.default/*.sqlite

    and I get "failed: No such file or directory" for every line containing a *. Can anybody help me? Or, as an alternative, can I combine find and `cat /path/to/list.lst` with rsync?
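
    One note: --files-from takes literal path names; rsync does not glob-expand wildcards listed in that file, which is why every line with a * fails. A workaround sketch, assuming the patterns should be expanded against /home/user, is to let the shell do the expansion first:

        cd /home/user
        shopt -s nullglob                      # drop patterns that match nothing
        while IFS= read -r pattern; do
            printf '%s\n' $pattern             # unquoted on purpose: the shell expands the glob
        done < /path/to/list.lst > /tmp/expanded.lst
        rsync -v --files-from=/tmp/expanded.lst /home/user /path/to/backup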

  • Where are Outlook 2010 Email Settings, not the pst data files

    - by user64908
    I've found all my Outlook data files, which contain all my emails, at the following paths:

        C:\Users\C\AppData\Local\Microsoft\Outlook
        C:\Users\C\AppData\Roaming\Microsoft\Outlook
        C:\Users\C\My Documents\Outlook Files\

    I've migrated all of these to the same directories on my new machine; however, my configuration is not there! None of my POP/SMTP settings are there; only the PST is loaded with my emails, and all other configuration is gone. Where precisely is that configuration stored?

  • Log all files saved on XP system.

    - by Jason Taylor
    I have a user who frequently saves items to places that he then forgets (or even forgets to save at all). Usually a simple search finds them, but not always. Is there any way to log/track the most recently saved files? It would be great to be able to see the last "saved" files, as the Recent Documents feature is unreliable when he constantly opens documents while searching for the file he just saved. Alternatively, any ideas on how to control this situation?

  • Understanding exim configuration files

    - by bobobobo
    So, I want to understand what's going on with this Exim configuration directory. In /etc/exim4, there's:

        exim4.conf.template
        update-exim4.conf.conf
        conf.d

    conf.d has a mess of directories and files, and inside each are a bunch of if statements, which I find really different. For example:

        maildir_home:
          debug_print = "T: maildir_home for $local_part@$domain"
          driver = appendfile
          .ifdef MAILDIR_HOME_MAILDIR_LOCATION
          directory = MAILDIR_HOME_MAILDIR_LOCATION
          .else
          directory = $home/Maildir
          .endif
          .ifdef MAILDIR_HOME_CREATE_DIRECTORY
          create_directory
          .endif
          .ifdef MAILDIR_HOME_CREATE_FILE

    My questions are: where do the CAPS variables get defined, how can I change them, and why are there so many if statements in these configuration files?

  • Find largest directories/files recursively

    - by Robert Munteanu
    I'm looking for a script/program which will display the top x largest directories/files and then descend into those folders and display the x largest directories/files there, for a configurable depth. For example:

        231MB bin
        - 220MB ls
        - 190MB dir
        - 15MB def
        - 3MB lpr
        - 10MB asd
        - 1MB link

    How can I do that?
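
    An interactive tool such as ncdu does more or less exactly this. As a quick non-interactive sketch with GNU du and sort (the path and depth are placeholders):

        # largest entries up to two levels deep, biggest first
        du -h --max-depth=2 /path/to/start 2>/dev/null | sort -hr | head -n 20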

  • Looking for good program for cataloging DVD, Video files, ebook

    - by Stecy
    I'm looking for a Windows program for cataloging my DVDs, video files, ebooks, etc. The program must be able to retrieve some information from the internet and from the files themselves. Also, one must-have requirement is that if a file/ebook changes location, I need to be able to update that in the software. Even better if the program allows bulk updating. Update: the order of preference is open-source, freeware, shareware, commercial.

  • What are the A0xxxxxx.DLL files?

    - by Joel Coehoorn
    If you've ever watched a Windows computer defrag a drive, you might have noticed that many of the files that are fragmented and need fixing have names like A0833773.DLL. If you know regular expressions, I could express the filename this way: A\d{7}[.]DLL. Does anyone here know what those files are or what they're used for?

  • Sort music files by disc number in Vista

    - by furikuretsu
    I'm on Vista, and the problem is that although I can see the disc number of a track in Winamp and some other media players, I cannot sort files by disc number in Windows Explorer. I've scanned through the whole long list of available file properties but haven't found "disc number" or any similar property. So is there any way to sort tracks by disc number in Windows Explorer, and if there is, how do I do it?

  • rsync not writing files

    - by Cyrcle
    I'm trying to set up rsync to back up a remote directory to my local drive. I cd to the directory that I want to pull the files to, then I enter:

        rsync -vrtW [email protected]:~/public_html

    I enter the password and it starts running. I get all the files listed, but none of them actually transfer. What am I missing? Thanks.
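
    One note: rsync given a source but no destination argument only lists the source's contents, which matches the behaviour described. Adding an explicit destination (here ".", the current directory) makes it actually copy:

        # the trailing "." is the local destination directory
        rsync -vrtW [email protected]:~/public_html .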

  • Sharing files between multiple sites using only desktop software

    - by perlyking
    Our organisation has three sites: a head office, where the master copies of company files are stored, plus two branch offices using only workstations and a NAS or two. Currently we're talking about <10GB.

    At the main office we have no admin access to the file server, as this is entirely controlled by the larger institution where we are located. For the same reason, we have no VPN remote access to this network. Instead, we simply have access to a network share over a Novell LAN.

    Question: how can we share files between offices in a way that minimises latency, i.e. that gives us a mirror of the main network share at each site? (There is little likelihood of concurrent editing, and we can live with the odd file conflict now and again.)

    Up to now, branch office staff have had to use GotoMyPC-type solutions to remotely access files held at the main office. Or email. I was hoping to use Google Drive on a dedicated workstation at each office to sync the contents of the network share (head office) or NAS (branch offices) via the cloud, but at my last attempt (29 Jun '12), the Google Drive installer would not allow me to designate the remote network share as the "target" folder. (I chose Google Drive over Dropbox et al. as we already use GMail for corporate mail.)

    The next idea was to use a designated workstation at head office to mirror the network share to a local drive, then use Google Drive to push that to the cloud. This seems a step too far. Nor do I have any good ideas about how to achieve this network/local mirroring, as we can't, for example, install the rsync daemon on the server.

    I do not want to use Google Drive locally on each workstation, as this will inconvenience users and, more importantly, move files off the backed-up, well-maintained (UPS, RAID etc.) network share at head office. Our budget is only in the £100s. Should we perhaps just ditch the head office server and use something like JungleDisk? At least this presents the user with what appears to be a mapped drive.

  • Rolling Apache2 log files

    - by Andrew B
    I'm using a CollabNet svn distribution on Linux, and the log files are configured through the standard Apache httpd.conf. It's been a while since I dealt directly with Apache, but my memory and Google seem to indicate that the only way to rotate Apache log files is outside of Apache, using a periodic script. Is there some convenient way I'm missing to rotate these?
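
    For reference, the two usual approaches are piped logging through Apache's bundled rotatelogs program (set on the CustomLog/ErrorLog lines in httpd.conf) or an external logrotate/cron job. A minimal cron-style sketch of the latter, assuming logs live under /var/log/apache2 and that apachectl is available (paths and service commands differ per distribution):

        #!/usr/bin/env bash
        set -euo pipefail
        stamp=$(date +%Y%m%d)
        for log in /var/log/apache2/*.log; do
            mv "$log" "$log.$stamp"          # move the live logs aside
        done
        apachectl graceful                   # make httpd reopen its log files
        sleep 30                             # let old children finish writing
        gzip /var/log/apache2/*.log."$stamp"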
