Search Results

Search found 72319 results on 2893 pages for 'file explorer'.

  • Error installing MediaWiki on Ubuntu with Postgres 8.3

    - by Masi
    How can you solve the error message at the last line? ....

        # Installing MediaWiki with php file extensions
        # Environment checked. You can install MediaWiki.
        # Generating configuration file...
        # Database type: PostgreSQL
        # Loading class: DatabasePostgres
        # Attempting to connect to database "wikidb" as "wikiuser"... error: No database connection
        # Checking the version of Postgres...
        # Warning: pg_version(): supplied argument is not a valid PostgreSQL link resource
        #   in /var/www/wiki/includes/db/DatabasePostgres.php on line 1078
        # FAILED. Required version is 8.1. You have 7.3 or earlier

    I am using Postgres 8.3, which makes the error message strange. The file "LocalSettings.php" was not created in the config directory, so I cannot continue the installation without solving the problem.
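
    Since the version check can only fail like this once the connection itself has failed, a reasonable first step is to confirm the database is reachable with the same credentials the installer uses. A minimal diagnostic sketch, assuming the standard psql client and the "wikiuser"/"wikidb" names shown above:

        # can the wiki account reach the database at all?
        psql -h localhost -U wikiuser -d wikidb -c 'SELECT version();'
        # if not, check that Postgres accepts TCP connections and that pg_hba.conf allows this user
        grep -E 'listen_addresses|port' /etc/postgresql/8.3/main/postgresql.conf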

  • Is it possible to mount a hot-swappable drive when it is turned on?

    - by John
    In my PC, I have a hot-swap drive. Usually I keep it off to save power; I only really use it when accessing it from another PC on the network. Is it possible to configure /etc/fstab to mount this drive when I turn it on (without having to shake the mouse, open the file manager and click the drive to have it mounted)? Currently, I have this in my /etc/fstab file:

        UUID=a869e5ca-7d3b-4d64-91e2-eadbecd8c9e5 /media/i-TVShows ext4 rw,nosuid,nodev,auto,user,uhelper=udisks 0 0

    but it doesn't seem to do the trick. I want the drive to be user-mountable, mounted on power-on, with RW access, and I'm thinking of adding 'nofail'... This is my first time writing to the fstab file, and a lot of the parameters I took from the output of 'mount', so feel free to correct any oddness you find. Thanks
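
    fstab only describes the mount; something still has to trigger it when the disk appears, which for a drive powered on mid-session means reacting to the hotplug event. A hedged sketch of one approach, a udev rule keyed to the filesystem UUID (the rule file name is hypothetical, and udev RUN handlers are meant to be short-lived, so treat this as a starting point rather than a finished solution):

        # /etc/udev/rules.d/99-automount-tvshows.rules (hypothetical name)
        ACTION=="add", SUBSYSTEM=="block", ENV{ID_FS_UUID}=="a869e5ca-7d3b-4d64-91e2-eadbecd8c9e5", RUN+="/bin/mount /media/i-TVShows"

    With the fstab line above in place, `mount /media/i-TVShows` resolves the device by UUID, so the rule itself stays short.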

  • Nagios and RRD on an old server

    - by Pier
    I have an old server (P4 based) on which Nagios (and all the other monitoring tools) is running. In the last few weeks we have been seeing strange behavior. In /var/spool/pnp4nagios (where temporary files are stored before being processed by the pnp4nagios daemon) we have many files like perfdata.1274949941-PID-18839, and we get errors in npcd.log:

        [05-27-2010 11:17:46] NPCD: ThreadCounter 0/15 File is perfdata.1274951306-PID-27849
        [05-27-2010 11:17:46] NPCD: File 'perfdata.1274951306-PID-27849' is an already in process PNP file. Leaving it untouched.

    Sometimes some graphs are not drawn. The server is pretty loaded (load average around 5-6 normally) and I suspect that npcd times out and leaves those files behind. What could I do (apart from changing the server)? A few details about the system: CentOS 5.5, Nagios 3.2.1, pnp4nagios 0.6 (from sources). Thanks
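
    If the leftovers really are orphans from timed-out runs, clearing old ones at least keeps the spool from filling up while the load problem is investigated. A minimal sketch, assuming files untouched for over an hour are stale (verify against your npcd timing settings before deleting anything):

        # list perfdata files older than 60 minutes first
        find /var/spool/pnp4nagios -name 'perfdata.*-PID-*' -mmin +60 -ls
        # once confirmed, the same match with -delete removes them
        find /var/spool/pnp4nagios -name 'perfdata.*-PID-*' -mmin +60 -delete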

  • named-checkzone reports 'ns.example.com.ns' has no address records (A or AAAA)

    - by hydroparadise
    The first thing I see wrong is that it's a recursion problem, but I'm not sure where the problem lies in my reverse lookup file. ns should report back as ns.example.com, but I'm instead getting ns.example.com.ns. Of course it wouldn't find any entries for that name, because there isn't one, nor is there supposed to be. Here's my reverse file:

        $TTL 86400
        @       IN      SOA     ns.example.com root.example.com. (
                                16071990   ; Serial
                                3600       ; Refresh
                                1800       ; Retry
                                604800     ; Expire
                                86400 )    ; Minimum TTL
        @       IN      NS      ns.example.com

    It's not extraordinarily complicated. My question is: what other files affect the output of named-checkzone when checking a name against the reverse file?
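
    For context, in zone-file syntax any name that does not end in a trailing dot has the zone origin appended to it, which is exactly how doubled names like ns.example.com.ns arise (consistent with the zone having been checked under the origin "ns"). A hedged sketch of the same records written fully qualified, trailing dots included:

        @       IN      SOA     ns.example.com. root.example.com. ( ... )
        @       IN      NS      ns.example.com.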

  • Archiving old, outdated hard drives

    - by Calvin
    I have a number of old hard drives that I've decided to throw out. But before I do, I'd like to keep the contents of the hard drives intact. I tried using the ISO file format to archive them, but the major problems are that it loses file attributes and can't create directories nested more than 8 levels deep. The drives use a variety of file systems (FAT, NTFS, ext2, ext3 and HFS) and I'd like to archive them without any loss of information.
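
    One way to sidestep per-filesystem limits entirely is to archive each disk (or partition) as a raw block image, which preserves every attribute because it preserves the filesystem itself. A minimal sketch, assuming a partition shows up as /dev/sdb1 (a hypothetical device name; check with fdisk -l first):

        # image the partition, compressing on the fly
        sudo dd if=/dev/sdb1 bs=4M | gzip -c > old-drive-01.img.gz
        # later, browse it read-only by loop-mounting the decompressed image
        gunzip old-drive-01.img.gz
        sudo mount -o loop,ro old-drive-01.img /mnt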

  • Alternative to SecondCopy?

    - by overtherainbow
    Hello, I've been using SecondCopy (7.0.0.146) on XP for a few years now to copy important files from one hard disk to another. One thing that bothers me is that it is unable to copy some files that are open. I assume Windows provides an API that allows an application to put an exclusive lock on an open file, and backup utilities like SecondCopy simply can't access such files until they are closed. As a result, since I have to close a bunch of files/applications for SecondCopy to complete successfully, I typically don't run SecondCopy regularly like I should... which pretty much defeats the whole purpose of backing up data :-/ For those of you using a similar solution to back up your important files onto a second mass-storage device... Can you confirm that an open file can be put off-limits with an exclusive lock, and that no backup solution will work with those? If you've tried SecondCopy and other solutions recently and ended up choosing something else, which one did you choose and why? Thank you for any feedback.

  • Cron Permission Denied

    - by worldthreat
    Good day. I have a bash script in my home directory that works properly from the command line (the file structure is a default Media Temple DV, noted for certain permission issues), but I receive this error from cron:

        /home/myFile.sh: line 2: /var/www/vhosts/domain.com/subdomains/techspatch/installation.sql: Permission denied

    NOTICE: it's just line 2... it writes to the local server just fine. Below is the bash file:

        #!/bin/bash
        mysqldump -uUSER -pPASSWORD -hHOST dbName > /var/www/vhosts/domain.com/subdomains/techspatch/installation.sql
        mysql -uadmin -pPASSWORD -hlocalhost dbName < /var/www/vhosts/domain.com/subdomains/techspatch/installation.sql

    I can't chmod from bash (lol, yeah I tried). Writing the file there and setting the permissions before the transfer is useless... I have googled the heck out of this situation and it still seems unique... Any insight is appreciated.
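
    Since the same redirect works from an interactive shell, the usual suspect is that cron runs the job as a different user (or with a different environment) than the login shell. A minimal diagnostic sketch to confirm who cron actually runs the job as, and whether that user can write to the target directory:

        # temporary crontab entry: record the effective user and groups
        * * * * * id > /tmp/cron-id.txt 2>&1
        # then test write access as that user (THAT_USER is a placeholder
        # for whatever /tmp/cron-id.txt reports)
        sudo -u THAT_USER touch /var/www/vhosts/domain.com/subdomains/techspatch/testfile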

  • Ubuntu dpkg error after crash and filesystem recovery

    - by Radian
    Ubuntu recently crashed, damaging its partition (which is ext4), and it was then unable to boot because it couldn't mount anything; it only displayed BusyBox. So I used the live CD to run fsck on the partition, which fixed it but deleted some inodes. Now Ubuntu is working, but some files are missing; for example, I lost the panel configurations and Chromium's extensions. The most annoying problem is that some files are corrupted. For example, when I try to install any program, I get this:

        (Reading database ... 95%dpkg: unrecoverable fatal error, aborting:
         files list file for package 'libservlet2.4-java' is missing final newline

    I tried these commands:

        dpkg --configure -a
        apt-get -f install

    and, from the GUI, Synaptic Package Manager's "Fix Broken Packages". So, about this 'libservlet2.4-java' file: does anyone know what it does, where it is located, and how I can fix it or get a correct version of it? Also, is there any way I could tell Ubuntu to check ALL of its files and, if something is corrupted, recover it from the CD?
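
    For context, dpkg keeps one file list per installed package under /var/lib/dpkg/info/, and the "missing final newline" message refers to that list, not to the package's own files. A hedged sketch of the commonly suggested repair (back the file up first; reinstalling the package afterwards makes dpkg rewrite its metadata cleanly):

        # the file dpkg is complaining about
        ls -l /var/lib/dpkg/info/libservlet2.4-java.list
        # keep a copy, then restore the missing trailing newline
        sudo cp /var/lib/dpkg/info/libservlet2.4-java.list /root/
        echo | sudo tee -a /var/lib/dpkg/info/libservlet2.4-java.list
        # then reinstall so the list is regenerated properly
        sudo apt-get install --reinstall libservlet2.4-java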

  • How do you set the Scheduled Tasks "Run only if logged in" option from the command line?

    - by Paul K
    I want to create some tasks with the "Run only if logged in" option set, but from a batch file using the schtasks.exe tool, the way you can in the user-interface version (Task Scheduler). I'm doing this so I don't have to specify a password, either in the batch file or when the batch file is run... I would use /ru System, but I can't, because the tool I'm automating (SyncToy 2.1) won't run under that account since I am hitting a network drive... Also, I noticed that Google Chrome sets up some tasks with this option set during installation, so I'm thinking there must be a way...
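
    One route that avoids storing a password is Task Scheduler 2.0's XML import, where the logon type is spelled out explicitly. A hedged sketch (Vista/Server 2008 and later; the task and file names are hypothetical, and only the relevant XML element is described):

        :: register a task from an XML definition whose <Principal> section contains
        ::   <LogonType>InteractiveToken</LogonType>
        :: i.e. "run only when the user is logged on", so no password is saved
        schtasks /create /tn "SyncToyNightly" /xml synctoy-task.xml

    On older systems, schtasks /create also accepts an /IT switch that marks a task interactive-only, though depending on the /RU account it may still prompt for credentials.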

  • Extract data from a specific range of cells in multiple worksheets in multiple files

    - by Michele
    Extract data from a specific range of cells (always the same cells) in multiple worksheets in multiple files, where 1 file = 1 day. I have 6 technicians each day of the week, Monday through Friday. So: 5 files, each with 6 worksheets. I have entered specific info in specific cells of every worksheet. The range is constant (the same address in EVERY worksheet in every file). So, I need a formula to extract and calculate the data in the given range and dump it into another spreadsheet. I can forward an example file if it will help anyone answer my question, or more explanation is available upon request. JUST PLEASE SOMEBODY HELP ME!!!!! Thank you all in advance. Regards, Michele
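
    Because the cell addresses never change, Excel's external-reference syntax may be enough on its own, pulling values even from closed workbooks. A hedged sketch, with hypothetical file, sheet and range names standing in for the real ones:

        ='C:\Reports\[Monday.xlsx]Tech1'!$B$2
        =SUM('C:\Reports\[Monday.xlsx]Tech1'!$B$2:$B$7)

    Repeating the formula per file and sheet (Monday through Friday, Tech1 through Tech6) in a master sheet consolidates everything without hand-copying.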

  • Why is my drive so full on my Windows 2008 Server?

    - by Zee Tee
    My server is Windows 2008 R2 Standard. I have a secondary SAS drive, holding all my website files, with the following properties:

        File system: NTFS
        "Allow files on this drive to have contents indexed in addition to file properties" IS CHECKED
        Layout: Simple
        Type: Basic
        Status: Healthy (Page File, Primary Partition)

    I have 3 folders on this drive (sizes as reported when I click Properties):

        Folder 1: 4 GB
        Folder 2: 2 GB
        Folder 3: 20 GB

    But the drive says it only has 10 GB left out of 65 GB. Why? I'm trying to make more room on this drive.
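
    Two invisible space consumers worth checking on a volume like this are the page file (the status line above says one lives on this drive) and Volume Shadow Copy storage, neither of which shows up in folder Properties. A hedged sketch of how to inspect both from a command prompt (vssadmin ships with Server 2008; E: is a hypothetical letter for the SAS volume):

        :: how much space shadow copies are allowed to use, per volume
        vssadmin list shadowstorage
        :: confirm whether a pagefile sits on this drive
        dir /a E:\pagefile.sys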

  • How to create custom Live CD with upgraded linux-image kernel?

    - by ???
    I'm following this tutorial to customize a live CD: http://www.debuntu.org/how-to-customize-your-ubuntu-live-cd However, it fails to upgrade linux-image; I guess it just can't update-grub. Any idea?

    EDIT: I copied the custom/ directory into an ext4 image, and now the linux-image-2.6.35-24 package upgrades successfully. But as shown in the original CD-ROM, the vmlinuz-* and initrd.img-* files are moved out to /casper/vmlinuz and /casper/initrd.lz. Well, I can move vmlinuz-2.6.35-24-generic to /casper/vmlinuz, but how do I compress the file initrd.img-2.6.35-24-generic into .lz format? It looks like lzip, but it's not:

        # cd original-maverick/casper
        # lzip -d initrd.lz
        initrd.lz: bad magic number (file not in lzip format).
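
    For what it's worth, the casper initrd.lz on Ubuntu live CDs is conventionally LZMA-compressed (hence lzip's "bad magic number" complaint), so repacking means using lzma rather than lzip. A hedged sketch, assuming the freshly generated initrd.img is gzip-compressed (the Ubuntu default):

        # the generated initrd is a gzip-compressed cpio archive;
        # the live CD expects the same archive compressed with lzma instead
        zcat initrd.img-2.6.35-24-generic | lzma -9 -c > casper/initrd.lz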

  • Deep recursion in WHM EasyApache software update causes out of memory

    - by Ernest
    I was trying to load some modules with EasyApache via a software update in WHM, because I need to install Magento ecommerce. I did the first EasyApache update; however, one module I needed was not loaded, so I loaded it later. But whenever I check Tomcat 5.5 in the profile builder I get:

        -- Begin opt 'Tomcat' --
        -- Begin dryrun test 'Checking for v5' --
        -- End dryrun test 'Checking for v5' --
        -- Begin step 'Checking jdk' --
        Deep recursion on subroutine "Cpanel::CPAN::Digest::MD5::File::_dir" at /usr/local/cpanel/Cpanel/CPAN/Digest/MD5/File.pm line 107.
        Out of memory!
        Out of memory!
        *** glibc detected *** realloc(): invalid next size: 0x09741188 ***

    Line 107 in question in File.pm is the third one in this snippet:

        if(-d $full) {
            $hr->{ $short } = '';
            _dir($full, $hr, $base, $type, $cc) or return;   # line 107
        }

    All my client sites are down and I don't know what to do to fix this.
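
    Purely as a lead to investigate: deep recursion in a directory walker like _dir is sometimes caused by a symlink loop in the tree being scanned, rather than by a genuinely deep hierarchy. A hedged diagnostic sketch (GNU find prints "File system loop detected" when a followed link points back into a parent; the path is a placeholder for whatever tree EasyApache scans):

        find -L /usr/local -type d > /dev/null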

  • Why did I fail to build ctags for vim?

    - by hugemeow
    I got the latest, unreleased version of the ctags source code from the Subversion repository using:

        svn co https://ctags.svn.sourceforge.net/svnroot/ctags

    I ran ./configure, which failed with the following error:

        config.status: creating Makefile
        config.status: WARNING: 'Makefile.in' seems to ignore the --datarootdir setting
        config.status: error: cannot find input file: config.h.in
        [mirror@home ctags-5.7]$ echo $?
        1

    Then I created an empty file named config.h.in, and now ./configure succeeds:

        configure: creating ./config.status
        config.status: creating Makefile
        config.status: WARNING: 'Makefile.in' seems to ignore the --datarootdir setting
        config.status: creating config.h
        [mirror@home ctags-5.7]$ echo $?
        0

    Running make still failed:

        [mirror@home ctags-5.7]$ make
        gcc -I. -I. -DHAVE_CONFIG_H -g -O2 -c args.c
        In file included from args.c:17:
        /usr/include/stdio.h:88: error: two or more data types in declaration specifiers
        make: *** [args.o] Error 1

    Why did this not work? How do I build ctags from the svn repository?
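
    A checkout from version control typically ships configure.ac but not the generated config.h.in, and an empty stand-in produces an empty config.h, so the -DHAVE_CONFIG_H build picks up none of the feature macros (which is one way system headers end up mis-parsed). A minimal sketch of regenerating the real file with autotools before configuring:

        # regenerate config.h.in (and friends) from the configure source
        autoheader          # or: autoreconf -fi, which runs the whole chain
        ./configure
        make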

  • Windows SBS 2008 to Windows Server 2012 migration

    - by StefanGrech
    I am in the process of upgrading my Windows SBS 2008 server, which runs Exchange, Active Directory and a file server, to Windows Server 2012 Essentials. Now, I know that Windows Server 2012 Essentials does not include Exchange, so I was planning to migrate Active Directory and the file server to Windows Server 2012 Essentials, and then have a separate virtual machine running Windows Server 2012 Standard with Exchange 2013. My question is: what should I do first? Migrate AD and the file server to Windows 2012 Essentials and then, after that migration is finished, do a local move of the mailboxes from SBS 2008 to the Windows Server 2012 Standard machine running Exchange 2013? Or should it be the other way round?

  • PowerShell overruling Perl binmode?

    - by hippietrail
    I have a Perl script which creates a binary file while scanning a very large text file. It writes to STDOUT, which I redirect to a file on the command line. To optimize it, I make changes and then time how long it takes to run. On Linux I use the "time" command for this; on Windows the best way to time a program seemed to be PowerShell's "measure-command". This seemed to work fine, but I noticed the generated files were larger. On examination I found that the files generated from within PowerShell begin with a BOM and contain CRLF pairs! My Perl script has a "binmode STDOUT" directive and works correctly in a normal DOS box. Is this a bug or misfeature in PowerShell or measure-command? Has it affected others creating binary files by means other than Perl? Googling hasn't turned anything up so far. I'm using Perl 5.12, PowerShell v1.0 and Windows XP.
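
    This matches how PowerShell's > operator behaves: redirection runs through PowerShell's own text pipeline (Unicode with a BOM, line-oriented), so raw bytes don't survive it. A hedged sketch of two workarounds that keep measure-command for the timing (scan.pl and its --output option are hypothetical stand-ins for the real script):

        # let cmd.exe do the byte-for-byte redirection inside the timed block
        Measure-Command { cmd /c "perl scan.pl > out.bin" }
        # or skip shell redirection entirely: have the script open and binmode the file itself
        Measure-Command { perl scan.pl --output out.bin }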

  • Scriptable FTPS client able to send Keep Alive to control port?

    - by schultkl
    We need an FTP client that satisfies the following constraints:

        1. Windows
        2. Command-line scriptable, so we can automate it... sorry, FileZilla (?)
        3. FTPS, as it seems to perform better than SFTP
        4. The ability to send keep-alive commands to the FTPS control port
        5. No passwords sent on the command line... sorry, curl

    Number 4, above, is critical: we have set keep-alive in some other clients (e.g., CoreFTP LE), but we seem to have some routing equipment in the server environment which drops our connection when transferring a 7 GB+ file. We have also set passive mode, and "resume transfer" functionality seems currently broken with this secure file-transport server... so we need to download the file in one go. What FTPS clients might meet our needs?

  • Windows 7 Ultimate x64 / Safari / Downloads disappear

    - by Dayton Brown
    Running Safari 4.0.5. When I download a file, it never appears in the download folder. If I pause the download, I get the xxxxxx.xxx.download temp file to show up. When I then resume the download, it will finish, but will never rename the file to xxxxx.xxx. I've heard from some other posts that AV seems to have a hand in it, but I'm running ClamWin AV, which doesn't have an in-memory AV process. What should I do?

  • Unable to restore from Shadow Copy due to long filename

    - by Spongeboy
    We have shadow copy enabled on our Windows SBS 2008 server. Attempting to restore a file from a shadow copy gave the following error:

        The source file name(s) are larger than is supported by the file system. Try moving to a location which has a shorter path name, or try renaming to shorter name(s) before attempting this operation.

    The filename has 67 characters, and its shadow-copy path is 170 characters. These seem to be under the NTFS limits (260?). We tried:

        Copying to the shortest path possible (C:)
        Copying to the shortest path possible on both a client computer and the server itself

    Is it possible to rename files in a shadow copy before doing the copy? Any idea why the error is appearing despite the filename size appearing to be within limits?
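
    When the real blocker is total path length, one commonly cited workaround is to expose the snapshot under a short local path instead of its long internal one. A hedged sketch using built-in tools (the shadow-copy index and file path are hypothetical; note the required trailing backslash on the device path):

        :: find the snapshot's device object name
        vssadmin list shadows
        :: expose it under a short path (Server 2008 includes mklink)
        mklink /d C:\shadow \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1\
        :: copy out with a much shorter source path, then clean up
        copy C:\shadow\path\to\the-file.docx C:\restore\
        rmdir C:\shadow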

  • Have Vim jump to a ctag in an existing tab

    - by Adrian Petrescu
    I have ctags configured with my Vim installation. My habit is to have all of the relevant files I'm working on open at once, in tabs, in Vim. The "problem" is that if I use Ctrl+] to jump to a ctag in a file I'm editing, it replaces the buffer in that tab, even though I have another tab already open containing that symbol. It would be much better if it just switched to that tab and jumped to the symbol there instead. This way I would always have a 1-to-1 tab-to-file ratio. I noticed the change notes for the taglist.vim plugin (which I also use) have an entry that says:

        1. Added support for jumping to a tag/file in a new or existing tab
           from the taglist window (works only with Vim7 and above).

    However, I couldn't find anything in the documentation for taglist (or ctags) about how to actually do this. Can any vim gurus fill me in? Thanks!
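
    Outside of taglist, plain Vim can at least keep tag jumps from clobbering the current tab. A hedged vimrc sketch using built-in commands (this always opens a new tab page rather than reusing one that already shows the file; true tab reuse generally needs a small function that scans existing tabs first):

        " open the tag under the cursor in a new tab page:
        " CTRL-W CTRL-] jumps to the tag in a split, CTRL-W T moves that window to its own tab
        nnoremap <C-]> <C-w><C-]><C-w>T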

  • Printer monitoring script (PowerShell)

    - by HannesFostie
    I am going to write a script of some sort to check Event Viewer on a Windows Server 2003 box for all print jobs, and then write them to a comma-delimited text file named like printername_floor_room.txt. I am wondering what the best way is to do this in near real time, checking the event log constantly. Any caveats I need to be aware of? Thanks

    EDIT: Okay, so I will most likely go with PowerShell and use Get-EventLog, then edit the "table" data. Problems I'm having: if I were to save all this data to a plain text file, how do I get the data back out of it? A comma-separated file I could work with, but otherwise I'm not really sure. And once that is sorted out, I'm still not sure how to keep the file updated more or less in real time. Can I make this service-like, without hogging all the resources? Run it every x seconds, for example?
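
    On Server 2003, completed print jobs are recorded in the System event log with source "Print", so Get-EventLog can do both the filtering and the CSV conversion in one pipeline. A hedged sketch (PowerShell 2.0 or later; event ID 10 is the "document printed" record there, but verify against your own log first):

        # pull recent print-job events and write them out as CSV
        Get-EventLog -LogName System -Source Print -After (Get-Date).AddHours(-1) |
            Where-Object { $_.EventID -eq 10 } |
            Select-Object TimeGenerated, UserName, Message |
            Export-Csv -Path 'printjobs.csv' -NoTypeInformation

    Running that on a schedule (Task Scheduler, or a loop with Start-Sleep) approximates near-real-time polling without writing a resident service.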

  • OpenVPN - Ubuntu 10.04 - Client Can't Connect to Server - Linux Route Add Command Failed

    - by nicorellius
    I suppose this could be asked on Server Fault as well, but it is specific to the client, so I thought I'd start here. I already have keys for an OpenVPN server in place, and I have used them to connect before, but from a Windows XP machine. I started by building the client.conf file so that I could run:

        sudo openvpn --config client.conf

    It seems correct, but I still can't connect, and I get these errors and lines of output:

        Mon May 31 14:34:57 2010 ERROR: Linux route add command failed: external program exited with error status: 7
        Mon May 31 14:34:57 2010 /sbin/route add -net 10.8.0.1 netmask 255.255.255.255 gw 10.8.0.17
        SIOCADDRT: File exists
        Mon May 31 14:34:57 2010 ERROR: Linux route add command failed: external program exited with error status: 7
        Mon May 31 14:34:57 2010 Initialization Sequence Completed

    I searched the net for forums and ideas and tried some file moving and renaming, but still ended up in the same place.
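
    "SIOCADDRT: File exists" is the kernel reporting that an identical route is already in the routing table, often left behind by an earlier openvpn run that didn't clean up after itself. A minimal diagnostic sketch:

        # is the VPN route already present?
        ip route show | grep 10.8.0
        # if a stale entry exists, remove it and retry the connection
        sudo route del -net 10.8.0.1 netmask 255.255.255.255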

  • Problem installing the sqlite3 module for Python 2.6 on an Ubuntu system

    - by Hoang
    I need to use the sqlite3 module with Python 2.6 on an Ubuntu system. How do I install this module for Python 2.6? Somehow I don't have it, and importing raises this error:

        >>> import sqlite3
        Traceback (most recent call last):
          File "<stdin>", line 1, in <module>
          File "/usr/local/lib/python2.6/sqlite3/__init__.py", line 24, in <module>
            from dbapi2 import *
          File "/usr/local/lib/python2.6/sqlite3/dbapi2.py", line 27, in <module>
            from _sqlite3 import *
        ImportError: No module named _sqlite3
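
    The /usr/local path in the traceback suggests this Python 2.6 was built from source, and _sqlite3 is a C extension that only gets built when the SQLite development headers are present at compile time. A hedged sketch of the usual remedy under that assumption:

        # install the SQLite headers, then rebuild and reinstall Python
        sudo apt-get install libsqlite3-dev
        cd /path/to/Python-2.6.x   # wherever the source tree was unpacked
        ./configure && make && sudo make install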

  • Cannot save iTunes library

    - by ongle
    Ever since I began using Windows 7 x64, I have been getting an infrequent error when using iTunes which goes something like this:

        The iTunes Library file cannot be saved. You do not have enough access privileges for this operation.

    Up until today this seemed to have no bad side effects, but today I got the error, exited iTunes and started it back up, only to discover my entire library was empty (the .xml file seemed to have been reinitialized). While I can restore my library from backup, this is pretty frustrating. Enough so that I may move the library to a Mac and sync there. I have reset the permissions on the folder, and I have excluded the file from anti-virus scanning. Neither of these things helped.

  • Dovecot Virtual Users and Users Domain Mapping

    - by Stojko
    I have successfully compiled, configured and run Dovecot with the virtual-users feature. Here's part of my /etc/dovecot.conf configuration file:

        mail_location = maildir:/home/%d/%n/Maildir

        auth default {
          mechanisms = plain login
          userdb passwd-file {
            args = /home/%d/etc/passwd
          }
          passdb passwd-file {
            args = /home/%d/etc/shadow
          }
          socket listen {
            master {
              path = /var/run/dovecot/auth-worker
              mode = 0600
            }
          }
        }

    I am facing one issue I can't resolve myself: is there any way to create a users-to-domains mapping and put the username first in mail_location? Examples:

        1. currently I have:  /home/domain.com/user/Maildir
        2. I'd like to have:  /home/USER/domain.com/user/Maildir

    Can I achieve this somehow? Greets, Stojko
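
    If USER in the second layout is just the user part of the login, Dovecot's path variables may already cover this, since mail_location is expanded per user. A hedged sketch using the same %n (user part) and %d (domain) variables as the config above:

        mail_location = maildir:/home/%n/%d/%n/Maildir

    If USER is instead an unrelated system account, a per-user override from the userdb (for example, an extra mail field in the passwd-file) would be needed rather than a single template.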
