Search Results

Search found 51448 results on 2058 pages for 'log files'.


  • Stand alone or free application to backup ADAM / AD LDS database files

    - by Darqer
    Do you know of any small, standalone and free tool that can be run from the console to back up / restore the ADAM / AD LDS database files (like adamntds.dit, edbres00001.jrs etc.)? I tried stopping the ADAM service and copying these files to another location, but afterwards I was unable to restore ADAM from them. I know that on Windows Server 2003 I could use a backup tool provided by Microsoft, but it seems to be unavailable on Windows Server 2008.
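
    One possible approach on Windows Server 2008 (a sketch, not something confirmed in the question) is the built-in dsdbutil utility, which can take an IFM-style snapshot of an AD LDS instance; the instance name and target folder below are placeholders:

        dsdbutil
        activate instance "MyAdamInstance"
        ifm
        create full C:\backup\adam-ifm
        quit
        quit

    The files written to the target folder can then be copied off-box like any other backup set.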

    Read the article

  • Windows Media Player - Display JPEG cover art for FLAC files

    - by pelms
    I got WMP 12 (on Windows 7 RC1) to play FLAC files by installing the Ogg Vorbis/FLAC DirectShow filters, but WMP does not display the embedded cover artwork. After experimenting I found that it will display the cover art if it is embedded in PNG format, but up to now I've used JPEG for all my FLAC files. Rather than re-tagging all my files, does anyone know a way to get WMP 12 to display JPEG embedded cover art?
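
    If re-tagging does turn out to be the only option, a rough sketch of the conversion using metaflac and ImageMagick (the filename is a placeholder, and it assumes the cover art is the only PICTURE block in the file):

        metaflac --export-picture-to=cover.jpg track.flac
        convert cover.jpg cover.png
        metaflac --remove --block-type=PICTURE track.flac
        metaflac --import-picture-from=cover.png track.flac

    Wrapped in a loop over the collection, this would convert the embedded JPEG art to PNG in place, at the cost of touching every file.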

    Read the article

  • Windows 7 "Backup completed but some files were skipped"

    - by Andrew Coleson
    I set up Windows 7 Pro to backup my files to a network path (woohoo!) and chose to backup "data for newly created users, libraries", and my user folder (no system image). All went fine (although the first backup took ~12 hours for some ridiculous reason), but at the end it gave me a message that "Your backup completed, but some files were skipped. Click to see which files." I checked and the "files" skipped were my 3 network-mapped drives, which is perfectly fine and reasonable behavior (I certainly don't need it to back up my network-mapped drives as part of my local PC backup), but in the Backup and Restore center it warns me that my Last backup was "Never" and the Action Center now has a permanent "Check your backup results" issue. Is there any way to set up the backup to exclude the network-mapped drives or tell it that I really don't mind that it skipped drives I never asked it to back up?

    Read the article

  • IIS7 compression: CSS files only compressed when dynamic compression is enabled

    - by Paul
    If anyone can help it would be appreciated. I would like to enable compression for static files within IIS7 (for the sake of simplicity I'll just refer to static CSS files for the time being). The problem I'm getting is that CSS files are only compressed when both dynamic and static compression are enabled in IIS for the website. What I really want to achieve is CSS (static file) compression whilst leaving the dynamic (aspx) pages uncompressed for the time being (to avoid unnecessary CPU load). I am puzzled as to why leaving just 'static compression' enabled causes CSS files to be returned uncompressed. My applicationHost.config file has not been altered and looks like this:

        <httpCompression directory="%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files">
            <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" />
            <staticTypes>
                <add mimeType="text/*" enabled="true" />
                <add mimeType="message/*" enabled="true" />
                <add mimeType="application/javascript" enabled="true" />
                <add mimeType="*/*" enabled="false" />
            </staticTypes>
            <dynamicTypes>
                <add mimeType="text/*" enabled="true" />
                <add mimeType="message/*" enabled="true" />
                <add mimeType="application/x-javascript" enabled="true" />
                <add mimeType="*/*" enabled="false" />
            </dynamicTypes>
        </httpCompression>

    The server-wide compression setting within IIS is set to 'Dynamic Disabled' and 'Static Enabled' on the Server > Features > Compression page. The per-site compression setting (Server > Sites > MyWebsite > Features > Compression) is where I am enabling and disabling dynamic compression as detailed above. Any help would really help me get unstuck on this. Thanks
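
    For what it's worth, one way to state "static on, dynamic off" explicitly for a single site is a web.config override along these lines (a sketch, assuming the urlCompression section is unlocked for delegation):

        <configuration>
          <system.webServer>
            <urlCompression doStaticCompression="true" doDynamicCompression="false" />
          </system.webServer>
        </configuration>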

    Read the article

  • What files to backup on Lighttpd+MySQL+PHP server

    - by Tomaszs
    I have a VPS with CentOS 5. I would like to create a backup of:

        - all my config files and tweaks (database, PHP, server)
        - the databases
        - cron settings
        - website files
        - installed applications and their settings (?)

    What files should I take into account? I don't want to miss any file that will be necessary to restore my webserver quickly in case of any failure. And I don't want to back up the whole VPS, because the entire VPS holds something like 30 GB of data.
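
    As a starting point, a minimal sketch of the kind of commands involved (paths and credentials are placeholders; it assumes the sites live under /var/www and the configs under /etc):

        mysqldump --all-databases -u root -p > /root/backup/all-databases.sql
        tar czf /root/backup/etc.tar.gz /etc
        tar czf /root/backup/www.tar.gz /var/www
        crontab -l > /root/backup/root-crontab.txt

    The real question of which extra files matter (application data directories, lighttpd logs, custom scripts) still depends on what is installed.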

    Read the article

  • Zabbix Trigger for SELinux (type=AVC) Errors

    - by Kevin Soviero
    I would like to create a trigger in Zabbix to alert me any time a type=AVC error appears in a CentOS 6 server's /var/log/audit/audit.log file. I've already tried creating a basic log scrape, e.g.:

        log[/var/log/audit/audit.log,type=AVC,"UTF-8",100]

    However, it does not work. I believe this is due to /var/log/audit/audit.log and its parent folder using the following permissions:

        drwxr-x---.  2 root root    4096 Apr 20 04:29 .
        drwxr-xr-x. 13 root root    4096 Apr 14 12:07 ..
        -rw-------.  1 root root 5948185 Apr 20 15:27 audit.log
        -r--------.  1 root root 6291566 Apr 20 04:29 audit.log.1
        -r--------.  1 root root 6291704 Apr 19 16:56 audit.log.2
        -r--------.  1 root root 6291499 Apr 19 05:22 audit.log.3
        -r--------.  1 root root 6291552 Apr 18 17:48 audit.log.4

    I would prefer not to change the permissions, for security reasons. Has anyone done log monitoring of /var/log/audit/audit.log using Zabbix? And if so, how?
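
    One hedged workaround, since the zabbix agent user cannot read audit.log directly, is to query the audit log through ausearch under sudo via a UserParameter (the item key name is made up, and the sudoers entry is an assumption about the local policy):

        # /etc/zabbix/zabbix_agentd.conf
        UserParameter=selinux.avc.count,sudo /sbin/ausearch -m avc --start recent 2>/dev/null | grep -c '^type=AVC'

        # /etc/sudoers.d/zabbix
        zabbix ALL=(root) NOPASSWD: /sbin/ausearch

    A trigger can then fire whenever selinux.avc.count returns a non-zero value, without loosening the permissions on audit.log itself.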

    Read the article

  • Open Garmin GPI files in Linux

    - by zero
    I have several files that are in the Garmin GPI format, from here, and want to access them in Linux. How can I do that? Update: I found the solution myself, so thanks to renan for trying to make sense of the question. The answer is to install gpsbabel and convert the files with it; after converting with gpsbabel they are just XML files and can be read by a human.
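
    For reference, a sketch of the conversion (gpsbabel has a garmin_gpi input format and a gpx output format; the filenames are placeholders):

        gpsbabel -i garmin_gpi -f poi.gpi -o gpx -F poi.gpx

    The resulting .gpx file is plain XML and can be opened in any text editor or most mapping tools.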

    Read the article

  • Access to files on Windows 2003 server from Mac

    - by Clinton
    Hi, I can access the Windows 2003 server from my Mac, but can't see all the files. I am using Snow Leopard and have followed fixes found online. All of them get me connected and I can see all the folders, but not all of the files within those folders appear. Someone else using a Mac can see all the files, which is very odd. Any help much appreciated. C

    Read the article

  • rsync doesn’t sync files

    - by modi
    Hi, I'm using rsync (with Cygwin) to sync two local folders. The folders contain binary files. I'm using the following command: rsync.exe -av dir1/ dir2/ However, the files in dir2 were only partially updated; a few files are still different. Does anybody know of a problem with rsync on Windows? Should I use some other flags? Thanks
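
    A sketch of flags worth trying when rsync's default size-and-timestamp check misfires on Windows filesystems (the exact cause here isn't confirmed):

        rsync.exe -av --checksum dir1/ dir2/
        rsync.exe -av --modify-window=1 dir1/ dir2/

    --checksum forces a full content comparison instead of trusting timestamps, and --modify-window=1 tolerates the 2-second timestamp granularity of FAT volumes.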

    Read the article

  • Files not running and folder named C:\windows\restop found with system files in it on windows 98

    - by Max
    I have an old Windows 98 machine that I started using for some stuff a few days ago. Today I noticed that I can't run many system files, so I checked my system folder and I noticed that most of the files are gone. After doing a search for them I found them in a folder in C:\windows called "restop". I don't really feel comfortable restarting because all the files are moved. Does anyone know what might've caused this or if it's safe to restart? Is there some special way to move the files back?

    Read the article

  • Extract InstallShield CAB files when there are no HDR files?

    - by user433531
    layout.bin      setup.lid     _sys1.cab     _user1.cab
        DATA.TAG        data1.cab     SETUP.INI     setup.ins
        _INST32I.EX_    SETUP.EXE     _ISDEL.EXE    _SETUP.DLL
        lang.dat        os.dat

    I want to extract an InstallShield 5 install package, and above is the list of files in the "data1" folder. However, there is no *.hdr file, so I can't extract the CAB files using the tools found on the Internet, even though the package can still be installed without any error. Can anybody give me a suggestion for this, please?
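
    One thing that might be worth trying (a suggestion, not tested against this particular package): the open-source unshield tool understands InstallShield 5 and 6 cabinets and may be able to open data1.cab directly, e.g.:

        unshield l data1.cab
        unshield x data1.cab

    The first command lists the cabinet's contents and the second extracts them into the current directory.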

    Read the article

  • Unable to copy files previously extracted from archives created on a Mac, even after claiming ownership

    - by Maxim Zaslavsky
    I reinstalled Windows on my computer today, and backed up my music to a USB drive. Now, I'm trying to copy the files onto my fresh Windows partition, but I'm unable to copy files that I obtained within my previous Windows installation from zip archives created on Macs. When I try to copy those previously-extracted files, I get an error saying that I need permission from S-1-5-21-...-1000 (a bizarre long ID). The first thing I tried was to take ownership of the files by setting my new user account as the owner, but that resulted in errors saying that I need permission from myself! Some Googling suggested adding antivirus exclusions, so I excluded the relevant folders from Microsoft Security Essentials, but the issue persists. For what it's worth, it seems that some program (so far I've only installed Chrome, Microsoft Security Essentials, and the latest Windows updates) created an empty folder named 601c8c7f0e0c03f725 at the root of my external USB hard drive. What gives?
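
    A command-line sketch of the ownership-and-ACL reset that the Explorer dialog is attempting (the path and account name are placeholders; run from an elevated prompt):

        takeown /F "E:\Music" /R /D Y
        icacls "E:\Music" /grant Maxim:(OI)(CI)F /T

    takeown reassigns ownership recursively, and icacls grants the new account an inheritable full-control entry on everything underneath, replacing the need for the leftover Mac-era owner SID.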

    Read the article

  • proftpd: copying uploaded files to an additional directory

    - by Matthew Iselin
    Using proftpd, is there a good way to automatically synchronise uploaded files from the upload directory to some other directory? Our layout ends up a bit like this:

        ~/ftp/some/path                              <-- Files are uploaded here
        ~/some/other/path/not/accessible/via/ftp     <-- But also need to be here after uploading

    Is there a good way to do this automatically, or do I have to tell anyone uploading files to upload twice, and open up an additional directory (containing data we cannot redistribute)?
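
    One candidate, assuming proftpd was built with mod_exec (this is a sketch, not a tested configuration, and the script path is a placeholder): have the server run a copy script after each upload.

        <IfModule mod_exec.c>
          ExecEngine on
          ExecOnCommand STOR /usr/local/bin/copy-upload.sh %f
        </IfModule>

    where copy-upload.sh would simply cp its argument into the second directory. mod_exec runs the command under the session's privileges, so the FTP user would still need write access to the destination.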

    Read the article

  • Using Nginx and Apache side by side for both static and dynamic files

    - by faridv
    Background: I've searched a lot and found these useful threads about using Apache or Nginx for static or dynamic files. But they are old (mostly from 1 or 2 years ago) and I think both web servers, specifically Nginx, have had important changes in performance and usage since then, so taking another look at this issue can't hurt:

        Nginx (for static files) and Apache (for dynamic content)?
        nginx better than apache for dynamic content? [closed]
        Apache or NGINX for PHP?
        Nginx as reverse proxy to Apache with only dynamic content?

    My question: I have a PHP web application with lots of dynamic files and lots of static content (videos, images etc.), and it has been running on a CentOS 6 server with Apache 2.2 for 2 months. In the past few days the number of our site visitors has grown very quickly, and if it keeps increasing at the current rate we will need to change many things (web server, application, etc.) to prevent failures. Because of the hardware limitations we are facing, I thought it would be best for us to start with the web server. Should I start with something else? Should I try to improve the performance of my PHP application and forget about the web server for now (even if that's going to take a long time)? Because of heavy use of .htaccess files (for redirection, rewrites, etc.), I think it's going to be painful to migrate to Nginx as the default web server, or maybe even just for dynamic files. Does this mean that I can't even use Nginx as a reverse proxy? I'm not sure whether the latest stable versions of Nginx and PHP-FPM perform better than my current Apache, and my limitations (too many things) won't let me just give it a try. Which one is doing better currently? What will I lose by migrating to Nginx? To make it short, what should I do?
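
    For the reverse-proxy variant specifically, a minimal sketch of the kind of Nginx server block involved (paths and the Apache port are assumptions; the .htaccess rules would keep working because Apache still serves the PHP requests):

        server {
            listen 80;
            server_name example.com;

            # serve static content directly from disk
            location ~* \.(jpg|jpeg|png|gif|css|js|ico|mp4|flv)$ {
                root /var/www/html;
                expires 30d;
            }

            # hand everything else to Apache on an internal port
            location / {
                proxy_pass http://127.0.0.1:8080;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            }
        }

    Apache would then be reconfigured to listen on 127.0.0.1:8080 instead of port 80.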

    Read the article

  • SQL Server 2005 standard filegroups / files for performance on SAN

    - by Blootac
    I submitted this to Stack Overflow (here) but realised it should really be on Server Fault, so apologies for the incorrect and duplicate posting. OK, so I've just been on a SQL Server course where we discussed the usage scenarios of multiple filegroups and files over local RAID and local disks, but we didn't touch on SAN scenarios, so my question is as follows: I currently have a 250 GB database running on SQL Server 2005 where some tables have a huge number of writes and others are fairly static. The database and all objects reside in a single filegroup with a single data file. The log file is also on the same volume. My interpretation is that separate data files should be used across different disks to lessen disk contention, and that filegroups should be used for partitioning of data. However, with a SAN you obviously don't really have the same issue of disk contention that you do with a small RAID setup (or at least we don't at the moment), and Standard edition doesn't support partitioning. So in order to improve parallelism, what should I do? My understanding of various Microsoft publications is that if I increase the number of data files, separate threads can act across each file separately. Which leads me to the question: how many files should I have? One per core? Should I be putting tables and indexes with high levels of activity in separate filegroups, each with the same number of data files as we have cores? Thank you
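
    If the answer does come down to adding files, the mechanics are straightforward T-SQL (a sketch with made-up names, sizes and paths):

        ALTER DATABASE MyDb ADD FILEGROUP HotTables;

        ALTER DATABASE MyDb ADD FILE
            (NAME = HotTables1, FILENAME = 'F:\Data\MyDb_hot1.ndf', SIZE = 10GB),
            (NAME = HotTables2, FILENAME = 'G:\Data\MyDb_hot2.ndf', SIZE = 10GB)
        TO FILEGROUP HotTables;

    Tables and indexes could then be rebuilt ON HotTables; how many files per filegroup is the part that depends on the SAN layout and core count.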

    Read the article

  • Linux command to concatenate audio files and output them to ogg

    - by hasen j
    What command-line tools do I need in order to concatenate several audio files and output them as one ogg (and/or mp3)? If you can provide the complete command to concatenate and output to ogg, that would be awesome. Edit: Input files (in my case, currently) are in wma format, but ideally it should be flexible enough to support a wide range of popular formats. Edit2: Just to clarify, I don't want to merge all wmas in a certain directory, I just want to concatenate 2 or 3 files into one. Thanks for the proposed solutions, but they all seem to require creating temporary files, if possible at all, I'd like to avoid that.
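
    A sketch with ffmpeg's concat filter, which decodes the WMA inputs and writes a single Ogg Vorbis file without creating intermediate files (filenames are placeholders):

        ffmpeg -i one.wma -i two.wma -i three.wma \
            -filter_complex "[0:a][1:a][2:a]concat=n=3:v=0:a=1[out]" \
            -map "[out]" -c:a libvorbis out.ogg

    Swapping -c:a libvorbis for -c:a libmp3lame and the .ogg extension for .mp3 would give the MP3 variant.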

    Read the article

  • Windows Batch Scripting: Newest File Matching a Pattern

    - by Eddie Parker
    This would be lightning quick in Linux, but I'm not familiar enough with the Windows flavour of batch scripting. Basically I want to look for a series of files matching a certain wildcard, and get the one with the most recent modified date. I've gotten as far as: for %%X in (*.exe) do ( REM Do stuff.... ) But I'm not sure what manner of comparison operators there are, or if there's a better way of doing this. Anyone offer up any good solutions? Ideally it would involve a vanilla install of Vista; so no special trickery like cygwin/etc.
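
    One vanilla-cmd sketch is to let dir do the date sorting and keep only the first entry, so no date comparisons are needed (the variable name is arbitrary):

        @echo off
        setlocal
        for /f "delims=" %%X in ('dir /b /a-d /o-d *.exe') do (
            set "newest=%%X"
            goto :found
        )
        :found
        echo Newest file is %newest%

    dir /o-d lists files newest-first, /a-d skips directories, and /b keeps the output to bare filenames.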

    Read the article

  • How to crash a program

    - by user2949019
    I have a program called BlueCoat Proxy installed on my school issued laptop that basically blocks every second website on the Internet, including stack exchange, YouTube and yahoo answers. I do not have administrator rights, nor can I delete anything in program files, I tried every possible method of obtaining admin rights. It is not accessible in task manager (it doesn't even appear there). I tried to close it with Windows command prompt through commands like 'taskkill' but it returns 'Access is Denied' (I'm only denied access with that program). Does anyone know a method of crashing a program with a batch file or VB program? I was thinking something like the ping command, though for a program. Maybe automating 1000 meaningless requests to the program? Your input on the subject matter is appreciated, however telling me that this is wrong or illegal is not.

    Read the article

  • Windows 7 - Non Admin run as Admin for Explorer - still can't see all tmp internet files

    - by Steve
    I'm trying to retrieve video files from the IE 8 cache for a user that's not an admin in Win 7. As a non admin user, I run Explorer as admin and still can't see the temp internet files for the non admin user. Only if I login as a user that is admin can I see the files. Is there any way I can see the files w/o having to go through the login process? Essentially, I want the video file from this page and others like it: http://video.yahoo.com/watch/111585/1027823

    Read the article

  • How to backup millions of small files?

    - by grassbl8d
    What is the best way to back up millions of small files in a very small time window? We have less than 5 hours to back up a file system which contains around 60 million files, most of them small. We have tried several solutions such as RichCopy, 7z and rsync, and all of them seem to have a hard time. We are looking for the most optimal way... We are open to putting the files in an archive first, or transferring the files to another location via network or hard disk transfer. Thanks
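
    Given the Windows-side tools already tried, one more sketch worth measuring is robocopy's multithreaded mode (the thread count and paths are placeholders; /MT needs the Windows 7 / Server 2008 R2 version of robocopy):

        robocopy D:\data \\backuphost\backup /MIR /MT:32 /R:1 /W:1 /NFL /NDL /LOG:C:\robocopy.log

    With tens of millions of tiny files the bottleneck is usually per-file metadata overhead rather than bandwidth, which is why the archive-first approach (tar/7z into one large file, then copy) is also worth benchmarking.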

    Read the article

  • Using rsync when files on one end are all lowercase

    - by DormoTheNord
    I want to rsync a lot of files from a Windows box to a Linux server. The problem is that the files on Windows are all mixed case, and the files on the Linux server need to be all lowercase. One solution is to have a script that rsyncs to a different directory on the server, copies the files into the main directory, and then converts them all to lowercase. I'd rather find a more elegant solution, though. I'd prefer a command-line application, but I'd be willing to go with a GUI application if that's the best option.
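
    Since rsync itself has no case-conversion option that I know of, a sketch of the two-step approach described above (the staging path is a placeholder; the rename loop only lowercases each entry's basename, so the -depth ordering keeps parent paths valid):

        rsync -av /cygdrive/c/data/ server:/srv/staging/

        # then, on the server
        find /srv/staging -depth | while read -r f; do
            base=$(basename "$f")
            lc=$(printf '%s' "$base" | tr '[:upper:]' '[:lower:]')
            [ "$base" != "$lc" ] && mv "$f" "$(dirname "$f")/$lc"
        done

    After the rename pass, the tree can be moved or rsynced locally into the main directory.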

    Read the article

  • Recovery of Pinnacle Studio Project Files

    - by seanieb
    My external hard drive had some sort of issue a few months ago, but I was able to recover my files using a data-recovery software program. However, my Pinnacle Studio project files are not being recovered as before; they are being recovered as directories/folders containing subdirectories and files. I have tried several different recovery programs and they all recover the projects as directories. Each project contains one file called README.TXT:

        * WARNING *
        This directory contains the descriptive data of the project, split into
        various subdirectories and files for better access.
        DO NOT EDIT, ADD, CHANGE OR MODIFY ANY OF IT'S CONTENTS!

    This gives me hope that I could somehow just turn the directory back into a .stu Pinnacle Studio project file. How would I go about doing this? Or is there another way to solve this problem?

    Read the article

  • Problem opening files with Gvim and NERDTree on Windows 7

    - by Oscar Duignan
    Just installed gvim on Windows 7 for the first time, and I'm having a problem opening some files. When I open files (I'm doing this through NERDTree), vim seems to flash up a cmd window for a few seconds before closing it (too quickly to make out the contents), and I end up with a C:/Program folder and a /Files/vim/vimfiles/doc/ folder in the directory of the file I just opened. I can see it's trying to access C:/Program Files/vim/vimfiles/doc, which is where vim is installed; however it's choking on the space, and I'm not familiar enough with gvim to work out why. Any and all ideas are greatly appreciated.

    Read the article
