Search Results

Search found 17260 results on 691 pages for 'folder tree'.


  • remove apache tar binary package in centos

    - by user119720
    I need help removing a binary package that has been installed on my Linux machine. The scenario is this: I installed the latest Apache, downloaded from its website (httpd.apache.org) as a Unix binary package (tar.gz). After the install, the Apache server worked perfectly without any issues. But it occurred to me that when a newer release of Apache comes out, I need to be sure I can upgrade the current Apache or reinstall the new version. So my question is: how do I make sure I have removed the old Apache and all its dependencies, so there is no conflict (probably) when installing the new Apache? Right now the only thing I can think of is to remove the Apache folder manually: rm -rf /apache2 Hope someone can shed some light on this. Thanks.
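
    A tarball install isn't tracked by any package manager, so removal really is just stopping the server and deleting everything under the prefix it was installed to. A hedged sketch, assuming the default source-install prefix of /usr/local/apache2 (substitute wherever yours went, e.g. /apache2):

        # Stop the running server first (path is the default source-install prefix)
        /usr/local/apache2/bin/apachectl stop
        # A tarball install keeps everything under its prefix, so removing it removes Apache
        rm -rf /usr/local/apache2
        # Also remove any init scripts or PATH symlinks you added by hand

    If clean upgrades are the real concern, installing Apache through yum (the httpd package) instead would let CentOS track and upgrade it for you.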

    Read the article

  • Dynamic Monitoring Service (DMS) Configuration Dumping and CPU Utilization

    - by ShawnBailey
    There was recently a report of CPU spikes on a system that were occurring at precise 3-hour intervals. Research revealed that the spikes were the result of the Dynamic Monitoring Service generating a metrics dump and writing it under the server 'logs' folder for every WLS server in the domain. This blog provides some information on what this is for and how to control it. The Dynamic Monitoring Service is a facility in FMW (JRF to be more precise) that collects runtime data on the components deployed to WebLogic. Each component is responsible for how much or how little it uses the service, and SOA collects a fair amount of information. To view what is collected on any running server you can use the following URL, http://host:port/dms/Spy, and log in with admin credentials. DMS is essentially always running and collecting this information in the runtime, and to protect against loss of this data it also runs automatic backups, by default at the 3-hour interval mentioned above. Most of the management options for DMS are exposed through WLST, but these settings are not, so we must open the dms_config.xml file, which can be found in DOMAIN_HOME/config/fmwconfig/servers/<server_name>/dms_config.xml. The contents are fairly short and at the bottom you will find the following entry: <dumpConfiguration>     <dump intervalSeconds="10800" maxSizeMBytes="75" enabled="true"/> </dumpConfiguration> The interval of 10800 seconds corresponds to the 3 hours and the maximum size is 75MB. The file is written as an archive to DOMAIN_HOME/servers/<server_name>/logs/metrics. This archive contains the dump in XML format. You can disable the dumps altogether by simply setting the 'enabled' value to 'false', or of course you could modify the other parameters to suit your needs. Disabling the dumps will NOT impact DMS collections or display at runtime. It will only eliminate these periodic backups.
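
    For reference, a minimal sketch of the same entry with the periodic dumps turned off (only the enabled attribute changes):

        <dumpConfiguration>
            <dump intervalSeconds="10800" maxSizeMBytes="75" enabled="false"/>
        </dumpConfiguration>

    Since the file lives under config/fmwconfig/servers/<server_name>, the change is per managed server, so it needs to be repeated (or scripted) for every server in the domain you want to quiet down.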

    Read the article

  • Run Win7 Guest (raw disk) in Ubuntu (which was installed as Dual Boot on existing Win7)

    - by kingdango
    I installed Ubuntu 12.10 on top of Win 7 as a dual boot (awesome!). I'm hoping to use VirtualBox to run my original Win7 instance as a guest OS under Ubuntu. I found this existing question and followed the directions to no avail. I can get the VMDK file created, but when I run the VM I just get a blank black screen with no additional information and Windows never loads. I see no HD activity or anything else that would indicate it's loading. I used this command to create the VMDK file: VBoxManage internalcommands createrawvmdk -filename ~/.VirtualBox/Win7Native.vmdk -rawdisk /dev/sda3 It looks like everything was created correctly, but I just get a blank screen when I run the VM. I do get this warning when I boot the VM: VirtualBox - Warning The virtual machine execution may run into an error condition as described below... The medium '/home/XXX/.VirtualBox/Win7Native.vmdk' has a logical size of 583GB but the file system the medium is located on can only handle up to 16GB in theory. We strongly recommend to put all your virtual disk images and the snapshot folder on a proper file system (e.g. ext3) with a sufficient size. ErrorId: Fat Partition Detected Severity: Warning How can I get this working?
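
    The 'Fat Partition Detected' warning suggests ~/.VirtualBox sits on a FAT-formatted filesystem, which VirtualBox warns cannot hold large images or snapshots. A hedged sketch that recreates the descriptor on an ext3/ext4 location and makes sure your user can read the raw disk (the target path is illustrative):

        # Recreate the raw-disk descriptor somewhere on an ext3/ext4 filesystem
        VBoxManage internalcommands createrawvmdk \
            -filename /home/user/vms/Win7Native.vmdk \
            -rawdisk /dev/sda3
        # VirtualBox runs as your user, so it also needs access to the raw device
        sudo usermod -aG disk $USER   # log out and back in, or adjust /dev/sda3 permissions

    Separately, a raw VMDK built from a single partition has no boot sector of its own, which by itself can produce a black screen; the more common setup is to pass the whole disk and restrict it to the Windows partition, e.g. -rawdisk /dev/sda -partitions 3, though getting a Windows installed on bare metal to boot inside a VM can still need further work.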

    Read the article

  • WUBI installation can no longer boot, UUID disk not found

    - by Joel Heenan
    Yesterday my Wubi/Vista installation was working fine. I shut it down at the train station, all good, then when I attempted to boot Ubuntu at home I got a message saying the UUID for root could not be found. By booting with the Ubuntu live CD I found that the stat information for the C:\ubuntu\disks folder was broken, reported as "??? ??? ??", that kind of thing. I booted into Windows, scheduled a CHKDSK, ran that on boot (it found some errors) and rebooted. Still no dice. I am not stressed, because it appears my home directory is still there with all my content, so I don't mind re-installing the OS (I'll probably clean it up some). What is the best path from here to repair the Wubi installation? Is there anything else I should do to repair it? I'm also looking at whether the drive is dying, to work out why this occurred. Possibly I moved the laptop before shutdown had completed.

    Read the article

  • How to allow Mac OS X's native Apache/PHP installation to access WebServer directories?

    - by Martin Bean
    I have a problem bugging me with Mac OS X's native Apache/PHP installation. With my PHP scripts, I have to alter the file permissions on each folder I want to access. For example, in an upload script I would have to set the destination directory to 'read & write' for the group 'everyone'. However, I believe this is not the best practice and would like all of my directories to be readily writable to PHP. My scripts are stored in /Library/WebServer/Documents/, which is Mac OS X's default directory to serve web pages locally.
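
    Rather than opening each folder to everyone, you can give Apache's own user write access to just the directories PHP needs to write to. On Mac OS X the bundled Apache runs as the _www user; a hedged sketch (the uploads path is illustrative):

        # Let Apache (user _www) write to the upload directory only
        sudo chgrp -R _www /Library/WebServer/Documents/uploads
        sudo chmod -R g+w  /Library/WebServer/Documents/uploads

    This keeps the rest of /Library/WebServer/Documents read-only to the web server while the scripts can still create files where they genuinely need to.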

    Read the article

  • How to install a new PHP extension on IIS 7.5

    - by Razor
    I have installed PHP 5.2.13 through the Web Platform Installer on IIS 7.5 / Windows 2008 R2. Now I'm trying to install additional PHP extensions, such as mcrypt, so I downloaded the compiled .zip file from the PHP Windows binaries site. I put the relevant DLLs in the extensions folder and added these lines to php.ini: [PHP_MCRYPT] extension=php_mcrypt.dll After restarting IIS, no application was working. This was with the VC6-compiled, non-thread-safe .zip package of PHP 5.2.13. I also tried the thread-safe version, which prevented IIS from restarting and forced me to reboot the server. What am I doing wrong here?
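
    Two things usually matter here: the extension DLL must match the installed PHP build (same PHP version, same VC runtime, non-thread-safe for IIS/FastCGI), and extension_dir in php.ini must point at the folder that actually holds the DLL. A minimal, hedged php.ini sketch (the path is an assumption, adjust to your install):

        ; php.ini -- extension_dir must point at the folder containing php_mcrypt.dll
        extension_dir = "C:\Program Files (x86)\PHP\ext"

        [PHP_MCRYPT]
        extension = php_mcrypt.dll

    After editing, restart the application pool (or IIS); running php -m from a command prompt is a quick way to confirm the module loads before involving IIS at all.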

    Read the article

  • install clamav from sourceforge

    - by maria
    I'm using Ubuntu 10.04. I'm afraid I've installed a virus on my computer (I clicked a link which was a virus), and I'd prefer to check that everything is fine. I installed ClamAV using Synaptic. The installed version was 0.96.5, while the most recent version is 0.97.6. When I tried to update the virus database, I got a warning that the version of ClamAV I'm using is outdated. Since I didn't know how to update the software through Synaptic or apt (both show the installed version as the newest), I uninstalled it all and downloaded the recent version from SourceForge. I unpacked the tar.gz archive and entered the folder, but when I type ./configure I get the message: configure: error: Please install zlib and zlib-devel packages When I type sudo aptitude install zlib zlib-devel the terminal output says there are no such packages (there is no package zlib-devel, and there are many packages whose names contain zlib). I suppose the link probably contained a Windows virus, but I'd prefer to make sure there is nothing on my computer. I'm also not sure whether such a virus, even without harming my system, could send itself to my e-mail contacts.
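
    zlib-devel is the Red Hat package name; on Ubuntu the zlib headers live in zlib1g-dev. A hedged sketch of the build prerequisites and the usual source-build steps:

        sudo apt-get install build-essential zlib1g-dev
        ./configure
        make
        sudo make install

    Note that a source install isn't tracked by apt, so future ClamAV updates become manual; staying with the packaged clamav and updating it through apt (a newer packaged version may be available via backports) keeps upgrades automatic.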

    Read the article

  • Using .htaccess, can you hide the true URL?

    - by Richard Muthwill
    So I have a web hotel with one main website, http://www.myrootsite.com/, and a few websites in subdirectories, in a folder called projects. I have domain names pointing to the subdirectories, but when hovering over a link in those websites the URLs are shown as: http://www.myrootsite.com/projects/mysubsite/contact.html When I'm on mysubsite.com I want them to be shown as: http://www.mysubsite.com/contact.html I spoke to support for the web hotel and the guy said to try using .htaccess, but I'm not sure exactly how to do this. Thank you very much for your time! Edit (for more information): My website is http://www.example1.com/ and I also own http://www.example2.com/. All of example2.com's files are in example1.com/projects/example2/. When you visit example2.com, you'll notice all of the URLs point towards example1.com/projects/example2/, but I want them to point towards example2.com/. Can this be done? I hope this is enough info for you to go on :). Edit (for w3d): I go to the URL mysubsite.com and the browser shows the URL mysubsite.com. The service I'm using creates an iframe around myrootsite.com and uses the URL mysubsite.com. I just hate that in Firefox and Internet Explorer, hovering over a link shows that the destination URL is: myrootsite.com/projects/mysubsite/...
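
    If mysubsite.com can be pointed at the hosting account itself (rather than through the iframe forwarding service), a rewrite in the root .htaccess can serve the subfolder's files under the subsite's own host name, so the browser only ever sees mysubsite.com URLs. A hedged sketch, assuming mod_rewrite is available and the domain resolves to myrootsite.com's document root:

        RewriteEngine On
        # Requests arriving for mysubsite.com are served from /projects/mysubsite internally
        RewriteCond %{HTTP_HOST} ^(www\.)?mysubsite\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/projects/mysubsite/
        RewriteRule ^(.*)$ /projects/mysubsite/$1 [L]

    As long as the links inside the subsite are relative (contact.html rather than the full /projects/mysubsite/ path), they resolve against mysubsite.com. This won't help while the domain is served through an iframe redirect service, since the framed page's links genuinely are myrootsite.com links.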

    Read the article

  • Windows 7 'All Programs' folders self-close on right-click

    - by Madmanguruman
    Odd issue on my Windows 7 Professional (32-bit) system. If I click on the Start 'orb' then navigate to 'All Programs', then navigate to any of the folders that appear at the bottom of the list (like Accessories) and left-click, the folder contents expand with no issue. If I right-click, the context menu appears for half-a-second or so, then the popup goes away and the entire start menu dismisses. I'm not sure how to debug this issue - I'm considering using Autoruns to try and disable things hooked into the shell one-by-one. Is there a way to use a tool like Process Explorer to narrow down the process that's actually dismissing the menus?

    Read the article

  • Find hosted directories/ports in Jetty/Apache

    - by Paul Creasey
    Hi, I first asked this on SO, but I didn't get a response and I think it is probably more appropriate here. Let's say I have a directory which is being hosted by Jetty or Apache (I'd like an answer for both), I know the URL including the port, and I can log into the server. How can I find the directory that is being hosted on a certain port? I'd also like to go the other way: I have a folder on the server which I know is being hosted, but I don't know the port, so I can't find it in a web browser. How can I find a list of directories that are being hosted? This has been bugging me for ages but I've never bothered to ask before! Thanks.
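
    A hedged starting point on the server itself: map the port to a process, then read that server's configuration for its document roots or webapp directories. The paths below are illustrative and depend on how the servers were installed:

        # Which process is listening on the port?
        sudo netstat -tlnp | grep ':8080'

        # Apache: ports and the directories they serve
        grep -R -E 'Listen|DocumentRoot' /etc/httpd /etc/apache2 2>/dev/null

        # Jetty: connector port in its config, plus the deployed webapp directories
        grep -R -E 'jetty.port|<Set name="port">' /opt/jetty/etc 2>/dev/null
        ls /opt/jetty/webapps

    Going the other way, grepping those same config locations for the folder's path shows which virtual host or context serves it, and the Listen/connector entry in the same file gives you the port.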

    Read the article

  • .htaccess error with css

    - by user66161
    Hey guys, I really need your help with writing SEO URLs. I'm new to Apache, mod_rewrite and .htaccess, and after a week I've had no success. I want to change: sub.domain.com/soccer/teams.php?name=tigers to sub.domain.com/soccer/tigers What should my link (tigers) be? And how would I set this up so that it doesn't cause .css / .jpg / .png errors? My .htaccess file is located in the /soccer/ folder. Please help, or direct me to where I can find help.
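
    The usual trick is to exclude real files and directories from the rewrite, so stylesheets and images keep being served directly. A hedged sketch for the .htaccess inside /soccer/ (the rule pattern is an assumption about what team names look like):

        RewriteEngine On
        # Don't rewrite requests for files or directories that actually exist (css, images, teams.php itself)
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # /soccer/tigers -> teams.php?name=tigers (internally)
        RewriteRule ^([A-Za-z0-9_-]+)/?$ teams.php?name=$1 [L,QSA]

    In the HTML the link would then simply be /soccer/tigers; writing asset paths as absolute URLs (e.g. /soccer/css/style.css) also stops them resolving relative to the rewritten URL.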

    Read the article

  • Upgrade manually-installed msi by assigning software through gpo

    - by Mr Happy
    In the past I rolled out software by manually installing it on a "golden" workstation, creating a (Ghost) image from that and rolling that out to the other workstations. I try not to do that any more for simple/small software, and when possible (if it's an .msi) I assign the software through GPO. I'm having a problem with one of those. The software was manually installed on the image, which was rolled out, and now I have an update for that software (a new msi) that I'd like to assign through GPO. Don't know if it's relevant, but it's user-assigned. The new version gets installed alongside the old version (this is possible since the program folder differs between the versions). When I install the same msi by hand, it properly removes/upgrades the old version, though. Is what I am trying to do possible?

    Read the article

  • Unable to list contents/remove directory (linux ext3)

    - by RedKrieg
    System is CentOS5 x86_64, completely up to date. I've got a folder that can't be listed (ls just hangs, eating memory until it is killed). The directory size is nearly 500k:

        root@server [/home/user/public_html/domain.com/wp-content/uploads/2010/03]# stat .
          File: `.'
          Size: 458752        Blocks: 904        IO Block: 4096   directory
        Device: 812h/2066d    Inode: 44499071    Links: 2
        Access: (0755/drwxr-xr-x)  Uid: ( 3292/ user)   Gid: ( 3287/ user)
        Access: 2012-06-29 17:31:47.000000000 -0400
        Modify: 2012-10-23 14:41:58.000000000 -0400
        Change: 2012-10-23 14:41:58.000000000 -0400

    I can see the file names if I use ls -1f, but it just repeats the same 48 files ad infinitum, all of which have non-ascii characters somewhere in the file name:

        La-critic\363-al-servicio-la-privacidad-300x160.jpg

    When I try to access the files (say to copy them or remove them) I get messages like the following:

        lstat("/home/user/public_html/domain.com/wp-content/uploads/2010/03/Sebast\355an-Pi\361era-el-balc\363n-150x120.jpg", 0x7fff364c52c0) = -1 ENOENT (No such file or directory)

    I tried altering the code found on this man page and modified the code to call unlink for each file. I get the same ENOENT error from the unlink call:

        unlink("/home/user/public_html/domain.com/wp-content/uploads/2010/03/Marca-naci\363n-Madrid-150x120.jpg") = -1 ENOENT (No such file or directory)

    I also straced a "touch", grabbed the syscalls it makes and replicated them, then tried to unlink the resulting file by name. This works fine, but the folder still contains an entry by the same name after the operation completes and the program runs for an arbitrarily long time (strace output ended up at 20GB after 5 minutes and I stopped the process). I'm stumped on this one, I'd really prefer not to have to take this production machine (hundreds of customers) offline to fsck the filesystem, but I'm leaning toward that being the only option at this point. If anyone's had success using other methods for removing files (by inode number, I can get those with the getdents code) I'd love to hear them. (Yes, I've tried find . -inum <inode> -exec rm -fv {} \; and it still has the problem with unlink returning ENOENT) For those interested, here's the diff between that man page's code and mine. I didn't bother with error checking on mallocs, etc because I'm lazy and this is a one-off:

        root@server [~]# diff -u listdir-orig.c listdir.c
        --- listdir-orig.c      2012-10-23 15:10:02.000000000 -0400
        +++ listdir.c   2012-10-23 14:59:47.000000000 -0400
        @@ -6,6 +6,7 @@
         #include <stdlib.h>
         #include <sys/stat.h>
         #include <sys/syscall.h>
        +#include <string.h>

         #define handle_error(msg) \
             do { perror(msg); exit(EXIT_FAILURE); } while (0)
        @@ -17,7 +18,7 @@
             char d_name[];
         };

        -#define BUF_SIZE 1024
        +#define BUF_SIZE 1024*1024*5

         int main(int argc, char *argv[])
         {
        @@ -26,11 +27,16 @@
             struct linux_dirent *d;
             int bpos;
             char d_type;
        +    int deleted;
        +    int file_descriptor;

             fd = open(argc > 1 ? argv[1] : ".", O_RDONLY | O_DIRECTORY);
             if (fd == -1)
                 handle_error("open");

        +    char* full_path;
        +    char* fd_path;
        +
             for ( ; ; ) {
                 nread = syscall(SYS_getdents, fd, buf, BUF_SIZE);
                 if (nread == -1)
        @@ -55,7 +61,24 @@
                     printf("%4d %10lld %s\n", d->d_reclen,
                            (long long) d->d_off, (char *) d->d_name);
                 bpos += d->d_reclen;
        +        if ( d_type == DT_REG )
        +        {
        +            full_path = malloc(strlen((char *) d->d_name) + strlen(argv[1]) + 2); //One for the /, one for the \0
        +            strcpy(full_path, argv[1]);
        +            strcat(full_path, (char *) d->d_name);
        +
        +            //We're going to try to "touch" the file.
        +            //file_descriptor = open(full_path, O_WRONLY|O_CREAT|O_NOCTTY|O_NONBLOCK, 0666);
        +            //fd_path = malloc(32); //Lazy, only really needs 16
        +            //sprintf(fd_path, "/proc/self/fd/%d", file_descriptor);
        +            //utimes(fd_path, NULL);
        +            //close(file_descriptor);
        +            deleted = unlink(full_path);
        +            if ( deleted == -1 ) printf("Error unlinking file\n");
        +            break; //Break on first try
        +        }
             }
        +        break; //Break on first try
         }

         exit(EXIT_SUCCESS);

    Read the article

  • Windows Vista Backups?

    - by skaz
    I am trying to configure a Windows Backup on Vista but don't see some capabilities I would expect to be there. For one, it looks like I can only select a local drive or a network share. I want to use a local drive, but I want to use a subfolder of one of the drives. Must I really pick the root? As a work-around, I made a network share to the local drive, thinking I could then pick the network share. However, when I do this, I am prompted for credentials to hit the share, and none work. The share works in Explorer, though, and it works from other computers, so the access is configured correctly. Is there any way to do what I am trying to do? Thanks.

    Read the article

  • Making document storage in Sharepoint a breeze (leave the Web UI behind)

    - by deadlydog
    Hey everyone, I know many of us regularly use Sharepoint for document storage in order to make documents available to several people, have them version controlled, etc.  Doing this through the Web UI can be a real headache, especially when you have multiple documents you want to modify or upload, or when IE isn't your default browser.  Luckily we can access the Sharepoint library like a regular network drive if we like. Open Sharepoint in Internet Explorer (other browsers don't support the Open with Explorer functionality), navigate to wherever your documents are stored, choose the Library tab, and then click Open with Explorer. This will open the document storage in Explorer and you can interact with the documents just as if they were on any other network drive.  This makes uploading large numbers of documents or directory structures super easy (a simple copy-paste), and makes modifying your files nice and easy. As an added bonus, you can drag and drop that location from the address bar in Explorer to the Favorites menu so that it's always easily accessible, and you can leave the Sharepoint Web UI behind completely for modifying your documents.  Just click on the new favorite to go straight to your documents.   You can even map this folder location as a network drive if you want to have it show up as another drive (e.g. an N: drive). I hope you found this as useful as I did.

    Read the article

  • NFS users getting a laggy GUI experience

    - by elzilrac
    I am setting up a system (Ubuntu 12.04) that uses LDAP, PAM, and autofs to load users and their home folders from a remote server. One of the options for login is sitting down at the machine and starting a GUI session. Programs such as Chromium (browser) that perform many read/write operations in the ~/.cache and ~/.config folders are slowing down the GUI experience, as well as putting strain on the NFS server, which causes problems for other users. Ubuntu has the handy-dandy XDG_CONFIG_HOME and XDG_CACHE_HOME variables that can be set to change the default location of .cache and .config from the home folder to somewhere else. There are several places to set them, but most of them are not optimal (one more option is sketched below):
    /etc/environment — pros: works across all shells; cons: cannot use variables like $USER, so users can't get different locations for .cache and .config; every user's new location would be the same directory.
    /etc/bash.bashrc — pros: $USER works, so you can place them in different folders; cons: only gets run for bash-compatible shells.
    ~/.pam_environment — pros: works regardless of shell; cons: cannot use system variables (like $USER), has its own syntax, and has to be created for every user.
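
    One more place worth considering (an assumption on my part, not part of the original setup): a small script in /etc/profile.d. It can use $USER and is read by login shells and, depending on the display manager, often by the GUI session startup as well, though like bash.bashrc it only covers Bourne-compatible shells. A hedged sketch, assuming a local scratch area exists:

        # /etc/profile.d/xdg-local.sh -- paths are illustrative
        export XDG_CACHE_HOME="/var/tmp/${USER}/.cache"
        export XDG_CONFIG_HOME="/var/tmp/${USER}/.config"
        mkdir -p "$XDG_CACHE_HOME" "$XDG_CONFIG_HOME"

    Moving only XDG_CACHE_HOME and leaving XDG_CONFIG_HOME on NFS is a reasonable middle ground: the cache is disposable, while settings are usually something users expect to follow them between machines.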

    Read the article

  • .htaccess redirect root directory and subpages with parameters

    - by wali
    I am having difficulty trying to redirect a root directory while at the same time redirecting pages in a subdirectory to a different URL. For example: http://test.example.com/olddir/sub/page.php?v=one to http://test.example.com/new/one while also redirecting any request to the root of the olddir folder. I have tried RewriteCond %{QUERY_STRING} v=one RewriteRule ^/olddir/sub/page.php /new/? [R=301] and RedirectMatch /oldir "test.example.com" RedirectMatch /olddir/sub/page.php?v=one "test.example.com/new/one" Any help at this point will be extremely appreciated... Thanks!
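
    Two details commonly trip this up: in a .htaccess (per-directory) context the rule pattern is matched without a leading slash, and RedirectMatch cannot see the query string at all, so the query test has to stay in a RewriteCond. A hedged sketch, assuming the .htaccess sits in the document root:

        RewriteEngine On

        # /olddir/sub/page.php?v=one  ->  /new/one  (the trailing ? drops the old query string)
        RewriteCond %{QUERY_STRING} (^|&)v=one(&|$)
        RewriteRule ^olddir/sub/page\.php$ /new/one? [R=301,L]

        # Everything else under /olddir goes to the new section
        RewriteRule ^olddir(/.*)?$ /new/ [R=301,L]

    The order matters: the query-specific rule runs first with [L], so only requests that don't match it fall through to the catch-all redirect.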

    Read the article

  • Windows server response time very high

    - by Nagaraju Bandla
    Server specs: Windows Server 2008 R2 64-bit, provider Fasthosts, .NET Framework 4.0, 6 GB RAM (it's using 4.6 GB). I have a website with thousands of pages structured like folderone/1/one to 500.aspx, folderone/2/one to 500.aspx, ... folderone/500/one to 500.aspx. Loading these pages for the first time after a release takes about 20 to 30 minutes per folder, and once one page is loaded the rest of the pages load fine. This happens for all folders, and it repeats every time I restart the server, add anything to App_Code or change the web.config. My site gets most of its traffic from Google, and because of this problem it's seeing errors. Any help will be highly appreciated; I am happy to buy you a beer if it's resolved. Thanks in advance...

    Read the article

  • rsync to ONLY keep files in destination that have been removed from source

    - by David Corley
    We use rsync to copy filesystem contents from one machine to another as a backup. We first run a MACHINE-X to MACHINE-Y rsync for a straight backup, with the --delete and --delete-excluded switches. We also run an internal rsync on MACHINE-Y, between the MACHINE-Y destination and another folder on MACHINE-Y, without either of the delete flags. This maintains a non-destructive copy in the event someone inadvertently deletes a file on MACHINE-X. However, it also has the overhead of being a complete copy of what has already been synchronized. Ideally I want to be able to run the non-destructive rsync in such a way that the destination ONLY receives the deleted files, and so avoid unnecessary duplication. Is there any way to do this?
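
    One hedged option is to skip the second full copy entirely and let the first rsync divert anything it would delete or overwrite into a separate "attic" directory on MACHINE-Y, using --backup/--backup-dir (host and paths below are illustrative):

        # Run on MACHINE-Y: mirror MACHINE-X, but keep anything deleted or replaced
        # in a dated attic directory instead of discarding it.
        rsync -a --delete --delete-excluded \
              --backup --backup-dir=/backup/attic/$(date +%F) \
              machine-x:/data/ /backup/mirror/

    The attic then holds only files that disappeared from (or changed on) MACHINE-X, which is essentially the "only the deleted files" copy being asked for; old attic directories can be pruned on whatever schedule suits.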

    Read the article

  • how to know who is accessing my system?

    - by calvin
    Is it possible to know if anyone is accessing any of the folders or drives on my system (32-bit Windows 2003)? I mean shared folders or non-shared folders, anything. And once we know, how do we deny access to a particular host? For shared folders I know how to do this, but if someone is accessing a folder with proper credentials, I don't know how to control it. Please ignore cases like BitTorrent etc. All I want to know is whether anyone is accessing my system folders in this way \\10.30.188.231\d$\calvin_docs with some valid username and password. I want to know the IP/username of the system that is accessing it.
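
    Access over SMB (including the administrative d$ share) shows up as a session and as open files on the server, so the built-in commands below list who is connected and let you drop a session. A hedged sketch (the host name in the delete example is illustrative):

        rem Who has a session open against this server, and from which machine
        net session

        rem Which files/folders are open over the network, and by which user
        net file
        openfiles /query /v

        rem Disconnect one remote machine's session (closes its open files)
        net session \\REMOTEPC /delete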

    Read the article

  • Cygwin file and directory user and group

    - by dvanaria
    I use Cygwin as my main development environment on both my home and work computers. In order to share files between the two computers, I use Dropbox, which is installed in the following folder on both computers: c:\cygwin\home\dvanaria\dropbox Everything works great, except for one thing. When I'm working on my home computer and do an ls -l on any directory, all the files show up as owned by dvanaria of group Users. But when I work from my work computer, an ls -l shows all files as being owned by Administrators and of group Domain Users. I know Cygwin uses some kind of mapping between Windows users and permissions to the /etc/passwd file. But to be honest I have no idea how this file works or how it maps to Windows under Cygwin. Could anyone help figure this out? The main problem is that I can't edit any files when using my work computer, only read them.
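
    Cygwin builds that mapping from /etc/passwd and /etc/group, which are normally generated once at install time; on a domain-joined work machine the generated entries can end up pointing at Administrators / Domain Users instead of your own account. A hedged sketch of regenerating them (run from an elevated Cygwin shell; -d pulls in domain accounts and can take a while):

        mkpasswd -l -d > /etc/passwd
        mkgroup  -l -d > /etc/group

    In a fresh shell afterwards, id and ls -l should show your own user and group, and the usual chown/chmod from Cygwin can fix any files created under the old mapping.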

    Read the article

  • Eliminating zero-length files

    - by RhZ
    I have been having multiple crashes recently, 4-5 last night within a few hours. I posted about it before and got an answer, but I'm not sure how to proceed. The messages in my logs right before the crash are multiple complaints about valid eCryptfs headers (the cron entry might not be related; I don't think I saw it in previous crashes):
        xxx-desktop kernel: [ 1112.274474] Valid eCryptfs headers not found in file header region or xattr region, inode 32376924
        xxx-desktop CRON[4212]: (root) CMD ( cd / && run-parts --report /etc/cron.hourly)
    So I was sent to an answer providing this script:
        for i in $(find $(mount | grep " on $HOME type ecryptfs" | awk '{print $1}') -size 0c); do
            if ! fuser -v $i; then
                rm -f $i
            fi
        done
    I did find some zero-byte files, not in exactly the right place (a folder called .private, as I remember), but I need to fix this; it's too bad right now. So I need to delete any of them that are not in use. I am a little too clueless, can someone walk me through executing this script? I don't know how.
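
    A hedged way to try it safely: put the loop in a file and do a dry run first, echoing the candidates instead of deleting them (the file name is illustrative):

        # ~/clean-zero-length.sh -- dry run: list zero-length files on the eCryptfs mount
        for i in $(find $(mount | grep " on $HOME type ecryptfs" | awk '{print $1}') -size 0c); do
            if ! fuser -v "$i"; then
                echo "would remove: $i"
            fi
        done

    Run it with bash ~/clean-zero-length.sh; if the list looks right, change the echo back to rm -f and run it again.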

    Read the article

  • Gmail download by POP3 won't download all emails. How do I reset all emails to "not downloaded" so that ALL will download?

    - by Rob
    I want to download emails from Gmail using POP3 with Outlook Express. It downloads about 350 or so emails but doesn't download the remainder; there are over 2000 emails. The emails that were downloaded are not recent ones. I've tried disabling and re-enabling the POP options in the settings in Gmail itself, but this doesn't fix the issue. Any ideas? Failing that, I would use IMAP and then copy the mail locally to the standard POP Inbox folder in Outlook Express, so that Express Archiver (a separate program) can archive each email as a file with a meaningful file name (e.g. subject, sender). I want to download the email because I back it up with the project work material it relates to, so it is all in one place.

    Read the article

  • Image archiving on network folders.

    - by Steve
    Our company uses Symantec Enterprise Vault to archive network folders and files, presumably to save on disk space. I can't see any other benefits. Our company is an architectural firm, and the problem our users face is locating particular images in network folders. Because all the files are archived, Windows Explorer is unable to generate image thumbnails. Each image needs to be individually restored from archive by double-click before it can be viewed. This is a big time-waster for our architects. Symantec say there is no workaround for this. Does anyone know of an alternative we could use for archiving images? Alternatively, some batch utility to create and maintain a thumbs.db file in each folder? Thanks.

    Read the article

  • How can I stop ntbackup requiring my new password every time I'm forced to change my Windows password?

    - by Lunatik
    I have a scheduled job that runs each night using ntbackup, which copies a folder on my HDD to a network share. The problem is that every time I'm required to change my Windows password I have to remember to change it in the scheduled job as well, otherwise the backup fails silently, i.e. I get no warning that the backup isn't being done. Is there a way to schedule this job so it will automatically pick up my new Windows password, or somehow not be tied to my main login? My user account type is Debugger, not full Administrator, so I'm not sure if that would restrict me in any way, e.g. still forcing a four-weekly password change on a dedicated user account for this. The PC runs XP SP2 on a Windows Server 2003 R2 domain.
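
    The stored password belongs to the scheduled task rather than to ntbackup itself, so one hedged approach is to run the task under a dedicated account whose password is set not to expire (creating such an account may need a domain admin), and update the task once. Task, account and password below are placeholders:

        rem Re-point the nightly job at a dedicated backup account
        schtasks /Change /TN "Nightly ntbackup" /RU MYDOMAIN\svc-backup /RP NewPasswordHere

    The account only needs read access to the source folder, write access to the network share, and the "log on as a batch job" right, so it doesn't have to be an administrator.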

    Read the article
