Search Results

Search found 41882 results on 1676 pages for 'png files'.

Page 660/1676 | < Previous Page | 656 657 658 659 660 661 662 663 664 665 666 667  | Next Page >

  • mod_rewrite directory path to deeper directory

    - by DA.
    I don't usually work with LAMP and am a bit stumped getting a site working locally. The site is set up to be used via localhost: 1) http://localhost/mysite However, because of the way the site files are physically laid out on the server, the root is actually located at: 2) /var/www/mysite/trunk/site I'm trying to figure out a way where I could type #1 but have Apache actually look for the files in #2, so that all of the asset/page links in the web application work. Is mod_rewrite the solution? If so, I'm stumped on the syntax. I have this, but it won't work (due, I assume, to it causing an infinite loop): RewriteRule ^mysite/ mysite/trunk/site I have a hunch I need to sprinkle on some regex?
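
    A minimal sketch of one way to express this (hypothetical, assuming an .htaccess in /var/www with mod_rewrite enabled); the extra condition keeps the rule from matching its own output and looping:

        RewriteEngine On
        # Skip requests that already point into the real directory tree
        RewriteCond %{REQUEST_URI} !^/mysite/trunk/site/
        # Map /mysite/... onto the physical location of the site
        RewriteRule ^mysite/(.*)$ /mysite/trunk/site/$1 [L]

    If editing the server config is an option, a plain Alias directive (Alias /mysite /var/www/mysite/trunk/site) would avoid mod_rewrite entirely.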

    Read the article

  • I accidentally deleted the recovery folder on a partition (Win Vista Home)

    - by paul
    I accidentally deleted the recovery folder on the recovery partition (Win Vista Home). I think it was some sort of scheduled maintenance from some program that I did not configure properly? Oops. I called Toshiba and they said I needed to buy a recovery program, which I didn't bother doing. I bought a legal copy of Vista and would like to install the correct files in such a way that when my computer starts looking for recovery files it will eventually find them, or I can point it to the partition. I'm pretty sure it's not just a matter of copy and paste (is it?). Thanks, Paul

    Read the article

  • Unable to set NTFS permissions for ApplicationPoolIdentity on Windows 2008 SP2

    - by Kev
    On Windows 2008 R2 I am able to set NTFS permissions for an application pool's synthesised ApplicationPoolIdentity account thus: ICACLS d:\websites\site1\www /grant "IIS AppPool\site1":(CI)(OI)(M) The website's application pool is named site1 and is configured to run as ApplicationPoolIdentity. The site's authentication is also configured to authenticate as ApplicationPoolIdentity. I've done this a thousand times on Windows 2008 Standard Edition R2 without a hitch. However, if I try to do the same on Windows 2008 Standard Edition SP2, I get the error: IIS AppPool\site1: No mapping between account names and security IDs was done. Successfully processed 0 files; Failed processing 1 files Setting permissions for the application pool identity via the security GUI fails in the same way. I've seen this before and a reboot has cleared it, but I'd like to know why this happens periodically. Googling around suggests other folks have hit this problem, but there's never a satisfactory explanation. Why would this be?
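
    One hedged thing to check before resorting to a reboot (commands assume the IIS management tools are installed; the pool name is the one from the question): the synthesised account can only be resolved while the pool exists and the IIS/WAS services are up, so confirming that and retrying sometimes avoids the restart.

        rem List the pool to confirm it exists and note its state
        %windir%\system32\inetsrv\appcmd list apppool "site1"
        rem Restart IIS, then retry the grant
        iisreset
        icacls d:\websites\site1\www /grant "IIS AppPool\site1":(CI)(OI)(M)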

    Read the article

  • Backup virtual hard disk

    - by Harshil Sharma
    I have a VM created in VMware Player. Its VHD is currently 17 GB, split among multiple 2 GB files. The host OS is Windows 8. I use CrashPlan on the host OS for file backup. The problem is, whenever I use the VM, CrashPlan detects all parts of the VHD as altered and backs up the whole 17 GB. What I want is software that can run on the host OS (Windows 8), treat the VHD as a physical hard disk and create incremental backups of it, including all files, programs and the OS.

    Read the article

  • Looking for internet traffic manager software for Windows 7

    - by Semyon Perepelitsa
    I have a 128 KB/s (1 Mbit/s) internet connection. Not very fast for downloading big files, but okay for browsing. When I start a big file download, I cannot surf the web normally: pages appear slowly, especially the pictures on them. So I pause the download, but then I regularly spend a lot of time reading articles or leaving the computer alone, when I could have let the file keep downloading. Is there any software that can automate this process of pausing and resuming downloads, or simply regulate the internet speed for each application automatically? I download files using different software and protocols (Download Master for HTTP and FTP; uTorrent for torrents; many other programs for updating themselves), so I don't want to be tied to a particular download program.

    Read the article

  • Desktop stops working in Windows 7

    - by Roger
    I'm having an odd problem on one of my Windows 7 machines. At machine startup, explorer.exe is working properly - I can use the Start menu, click on desktop icons, etc. At some later point, though, I lose the ability to interact with desktop icons. I can't double-click to open files on the desktop, and I can't right-click to see the properties window. Oddly enough the Start menu still works properly, and I can still open an Explorer window into my desktop and access everything. I can get around the problem by killing explorer.exe and restarting it using Task Manager, but it's annoying to have to do this over and over. Does anyone have any ideas? I took a look at startup files with msconfig and didn't see any obvious problems. Any suggestions would be appreciated.
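
    A tiny convenience sketch (a hypothetical restart-explorer.cmd batch file) that automates the Task Manager workaround described above; it does not address the underlying cause.

        :: Kill the shell and start a fresh instance
        taskkill /f /im explorer.exe
        start explorer.exe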

    Read the article

  • Does ZFS cache compressed or uncompressed data in a ZFS file-system with compression turned on?

    - by George Bailey
    ZFS supports file-system compression and it also caches frequently or recently accessed data. If a system has plenty of CPU but the underlying data storage is slow, it is possible that ZFS would perform better with compression turned on. This can easily be tested when writing files by measuring CPU and disk usage and throughput (of course there may be some latency, but this would not be an issue for large files). But what about the cache? If data has to be decompressed every time it is read, then this is probably less of a good idea. Is the cached data compressed? Does anybody have some information on this?
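
    A small sketch of the write-side measurement the question alludes to (the dataset name is a placeholder); compressratio gives a quick read on whether compression is doing anything at all, and zpool iostat shows whether re-reads are hitting the disks or being served from cache.

        # Check whether compression is on and how well it is working
        zfs get compression,compressratio tank/data
        # Watch pool I/O while re-reading a large file; cache hits show up as
        # little or no read activity on the underlying devices
        zpool iostat -v tank 5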

    Read the article

  • Apache Tomcat 6.0.29 startup causing Win XP SP3 reboot

    - by Liam
    Some of our Windows XP developer machines die (black screen, no ping) when starting Tomcat from a console. This only happens if the Tomcat debug console window is displayed (not minimised). When minimised, the startup and run are successful. I've checked the Tomcat logs, and the crash sessions produce empty (zero-byte) log files. No system logs exist and no crash dump is created, even though this is enabled in the Windows settings. The machine is dual screen, and another machine with the same image has the same issue. The only clue is that just before the crash, on one of the monitors the system reports "Cannot display video mode 1280x1024" (approximate error, I didn't write it down). Is there a way I can get Windows to produce log files for this crash session? Beyond running Tomcat as a service or in a minimised window, how can I fix this (or even investigate it)?
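
    Two hedged workarounds on the Windows side while the video-driver angle is investigated (CATALINA_HOME is assumed to point at the Tomcat install): start the console minimised from the outset, or register Tomcat as a service using the wrapper shipped with the Windows distribution.

        rem Start Tomcat with its console window minimised
        start /min "Tomcat" %CATALINA_HOME%\bin\catalina.bat run
        rem Or install it as a Windows service (Windows zip distribution)
        %CATALINA_HOME%\bin\service.bat install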

    Read the article

  • How to delete vssver2.scc and global.asax from my Windows Server?

    - by rlb.usa
    I have a Windows server that I SFTP into, and I have some very old vssver2.scc files on there. They were used by Visual SourceSafe, which is no longer used (we use SVN now). I want to delete them. Most troubling, though, is a very old global.asax file used by ASP.NET applications: since the app is compiled, it reads from its global.dll in the Bin folder, not the global.asax. I want to get rid of it, but I can't, and I can't overwrite it with a newer one either. These files have 444 (Owner:r Group:r Public:r) permissions, and when I try to give them 777 (O:rwx G:rwx P:rwx) permissions, hoping that will let me delete them, they go back to 444.
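
    A hedged guess at what is happening (the paths below are examples only): on a Windows host the 444 shown over SFTP usually just reflects the read-only file attribute, which an SFTP chmod cannot clear, so removing the attribute from a console or RDP session first may be what is needed.

        rem Clear the read-only attribute, then delete (run on the server itself)
        attrib -r D:\inetpub\wwwroot\myapp\vssver2.scc /s
        del /s D:\inetpub\wwwroot\myapp\vssver2.scc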

    Read the article

  • Copy & Paste Images Wiki functionality?

    - by Jakub
    I was discussing this with some coworkers: they would like a wiki, but are turned off by the need to constantly [browse] and [upload] files. They like the functionality in Lotus Notes where they can copy & paste screenshots (or crops, etc.) directly into Notes databases/libraries, but they are not fans of Lotus Notes beyond that (performance, etc.). Has anyone worked with a free (open source, hopefully) wiki or documentation application that just allows copy & paste of text and image content? It would be great to have one installed internally that we could use for documentation without constantly uploading/attaching files. I'm not sure if I am asking for functionality that just isn't available in a browser.

    Read the article

  • Transfer code from one server to another server

    - by Kamlesh Bhure
    I want to transfer new code to my new production server. I have the code files on my development server. Instead of uploading the files over FTP from my local machine, is there another way to transfer the code from one server to the other? What I am thinking is that I will make an archive of the whole codebase and place it in the webroot, so that it is accessible over the internet at a link like http://www.mydomain.com/code.tar.gz Then on the other server I will just run the command wget http://www.mydomain.com/code.tar.gz Will this transfer be done in a few seconds? And is this a correct approach?
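
    A sketch of the approach described (the domain is the example from the question; the paths are placeholders); since the archive sits in the public webroot, excluding it from itself and removing both copies afterwards matters.

        # On the development server: pack the codebase into the webroot
        cd /var/www/mydomain && tar -czf code.tar.gz --exclude=code.tar.gz .
        # On the production server: fetch and unpack, then clean up
        wget http://www.mydomain.com/code.tar.gz
        tar -xzf code.tar.gz && rm code.tar.gz

    For a server-to-server copy, rsync over SSH (rsync -az /var/www/mydomain/ user@prod:/var/www/mydomain/) avoids exposing the archive publicly at all.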

    Read the article

  • FTP server (vsftpd) with webgui

    - by manutenfruits
    I want to build a file server that lets users upload and download mostly multimedia, but also ordinary files. Right now I have an Arch installation with vsftpd, and I'm about to install miniDLNA for multimedia sharing. The only problem is that FTP doesn't seem to fit my needs, because it almost always forces users to use a client such as FileZilla for the server to be friendly. I have been looking for a web frontend for vsftpd, but apart from management interfaces there's nothing. I need a frontend accessible from a browser through which users can navigate through the folders in an easier and more elegant way than the plain FTP listing that browsers show by default. It should let users upload files and, as an awesome extra, let them play the multimedia directly in the browser. For this I am willing to dump FTP if needed; I've heard about HTTP file servers but don't know much about them. I could code everything myself, but there's got to be something out there already.

    Read the article

  • Ubuntu can't install an older version of a package

    - by Trevor Newhook
    When I try to do an apt-get install, I keep getting an error: Depends: libgtk-3-common (= 3.4.1-0ubuntu1) but 3.4.2-0ubuntu0.4 is to be installed When I run sudo apt-get -f install, I get several dpkg: warning: files list file for package 'XXX' missing, assuming package has no files currently installed. warnings, and then: Preparing to replace libgtk-3-bin 3.4.1-0ubuntu1 (using .../libgtk-3-bin_3.4.2-0ubuntu0.4_i386.deb) ... Adding 'diversion of /usr/sbin/update-icon-caches to /usr/sbin/update-icon-caches.gtk2 by libgtk-3-bin' dpkg-divert: error: rename involves overwriting `/usr/sbin/update-icon-caches.gtk2' with different file `/usr/sbin/update-icon-caches', not allowed dpkg: error processing /var/cache/apt/archives/libgtk-3-bin_3.4.2-0ubuntu0.4_i386.deb (--unpack): subprocess new pre-installation script returned error exit status 2 Errors were encountered while processing: /var/cache/apt/archives/libgtk-3-bin_3.4.2-0ubuntu0.4_i386.deb E: Sub-process /usr/bin/dpkg returned an error code (1) I'm not sure why it's complaining about a newer version of the package, but any help would be appreciated.
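
    A hedged sketch only, assuming the stale icon-cache diversion left behind by an earlier libgtk-3-bin installation is what blocks the unpack; inspect the diversion list before removing anything.

        # Show any diversion registered for the icon-cache tool
        dpkg-divert --list | grep update-icon-caches
        # If a leftover diversion is present, drop it and let apt retry
        sudo dpkg-divert --rename --remove /usr/sbin/update-icon-caches
        sudo apt-get -f install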

    Read the article

  • LAMP server permissions on a development server

    - by user101289
    I run a LAMP server on an Ubuntu laptop that I use only for development. I am not greatly concerned with security, since the server is never accessible outside the local network and it's turned off when I'm not using it. My question is: what is the simplest and 'best' way to set permissions/users/groups so that when my own user creates, edits or writes files in the webroot, I won't need to go through and chmod/chown everything back to the www-data user? Should I add myself to the www-data group? Or chown the webroot to www-data:myself? Or is there a best practice for this situation so I don't have to keep re-setting the ownership of these files? Thanks
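
    One common arrangement, sketched under the assumption that the webroot is /var/www/html and the login is myuser (both placeholders): keep www-data as the group owner, join that group, and set the setgid bit on directories so new files inherit it.

        # Join the web server's group (log out and in for it to take effect)
        sudo usermod -aG www-data myuser
        # Hand the tree to you, with www-data as the group, group-writable
        sudo chown -R myuser:www-data /var/www/html
        sudo chmod -R g+rw /var/www/html
        # New files and directories created here inherit the www-data group
        sudo find /var/www/html -type d -exec chmod g+s {} +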

    Read the article

  • Is it secure to store the cert/key on a private AMI?

    - by Phillip Oldham
    Are there any major security implications to bundling a private AMI which contains the private key/certificate & environment variables? For resiliency I'm creating an EC2 image which should be able to boot and configure itself without any intervention. After boot it will attempt to: Attach & mount specific EBS volume(s) Associate a specific Elastic IP Start issuing backups of the EBS volume(s) to S3 However, to do this it will need the private key/pem files and will need certain environment variables to be available on start-up. Since this is a private AMI I'm wondering if it will be "safe" to store these variables/files directly in the image so that I don't need to specify any user-data information and can therefore start a new instance remotely (from my iPhone, if needed) should the instance be terminated for any reason.
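
    A hedged sketch of the boot-time script the question describes (all IDs and the mount point are placeholders; it assumes the AWS CLI, or equivalent tooling of the time, is installed and credentialled inside the image, which is exactly the trade-off being weighed):

        #!/bin/bash
        # Discover this instance's ID from the metadata service
        INSTANCE_ID=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)
        # Attach and mount the data volume
        aws ec2 attach-volume --volume-id vol-0123456789abcdef0 \
            --instance-id "$INSTANCE_ID" --device /dev/sdf
        sleep 10 && mount /dev/xvdf /data
        # Claim the Elastic IP
        aws ec2 associate-address --instance-id "$INSTANCE_ID" --public-ip 203.0.113.10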

    Read the article

  • Access Denied on LAN IIS Access via Integrated Authentication

    - by Pharao2k
    I have an IIS 7.5 (Win2k8R2) web server which publishes a UNC share (on a file server) with restricted access. The AppPool identity is a domain user account with read access to the mentioned UNC path. Authentication modes are set to Anonymous and Integrated Authentication. When I access the path via localhost from the web server itself, it works, but if I try the hostname or IP from either the web server or a client, I get three authentication prompts (it does not accept my credentials) and a 401.3 Unauthorized error message (yet it states that I am logged in with my normal credentials, which definitely have access rights to the UNC path and its files). The security zone is set to Local Intranet. Sysinternals Process Monitor lists CreateFile operations on the UNC path (and other existing files in it) with Access Denied, impersonating the correct credentials. I don't understand why it is not working; it seems to use the correct credentials at every step of the way but still fails its operations.

    Read the article

  • First request too slow even though I have a load balancer in the back

    - by adrian7
    I have Apache 2 on CentOS + BIND with a WordPress website on it (e.g. example.com). I have also set up, on another server in a different country, a load balancer (varnish:80 + nginx 127.0.0.1:8080) for it, whose task is to serve all static content under /wp-content/. Using Simple DNS editor I added an A record for cdn.example.com pointing to that server's IP, so there is no extra work for a second DNS server. Then, using .htaccess, I redirect all requests for jpg|gif|css|js files to cdn.example.com. That works, and all files are saved on the "cdn" server and served right away. My problem is that the first time I visit example.com (e.g. after restarting the computer or closing the browser) the load time is 1 to 3 seconds, while any subsequent page loads take only 300 to 600 milliseconds. I know it might be a DNS issue, but I have done a cache check on several websites and cdn.example.com resolves to the right IP. Do you have any ideas where I should dig to solve this first-time slowness?
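
    For reference, a sketch of the kind of .htaccess rule the question describes (the domains are the examples from the question); using R=301 rather than R=302 lets browsers cache the redirect, which trims the extra round trip on repeat visits.

        RewriteEngine On
        # Only rewrite requests that arrive at the origin host
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
        # Send static assets to the CDN host, keeping the original path
        RewriteRule \.(jpe?g|gif|css|js)$ http://cdn.example.com%{REQUEST_URI} [R=301,L]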

    Read the article

  • Corrupted file, hard drive test?

    - by all-R
    Hi guys, I'm currently on a MacBook with a 1 TB external hard drive connected through a USB hub which is plugged into the MacBook. The problem is that my disk, which is partitioned in two (one HFS+ and one NTFS), keeps getting corrupted. Recently it was my HFS+ partition; I could not repair it using Apple's Disk Utility, but I was able to back up my files. Is this a sign that my hard drive is failing? Or is it because of my USB hub? I also keep my whole iTunes library on the external HD (the HFS+ partition), and did a lot of transfers lately, adding and removing files, etc. The last time, my partition got corrupted after a lot of deleted items. If anybody has an idea of what to check first, or what could cause the problem, I would appreciate it :) Thanks!
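
    A couple of quick checks from Terminal, sketched with placeholder identifiers (run diskutil list first to find the right ones); a repeatedly failing verify, or a failing SMART status where the enclosure exposes it, points at the drive rather than the hub.

        # Identify the external disk and its volumes
        diskutil list
        # SMART status (often reported as "Not Supported" through USB enclosures)
        diskutil info disk2 | grep -i smart
        # Verify the HFS+ volume without modifying it
        diskutil verifyVolume /Volumes/MyHFSVolume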

    Read the article

  • Help on using mod_rewrite to serve an I18N static site

    - by Guandalino
    My static site www.example.com is translated into different languages and the files are organized in this hierarchy:

        /
        /de
            index.html
            seite-1.html
        /en
            index.html
            page-1.html
        /it
            index.html
            pagina-1.html

    The root contains no files, just one subdirectory for each language the site is translated into, while each subdirectory contains the pages translated (both content and file name) into the language corresponding to the subdirectory name: de, en, it, etc. The question is: how do I configure mod_rewrite so that when a client visits www.example.com it is taken to the correct version of the site, falling back to the English version if the required locale is not supported (i.e. the Accept-Language header doesn't exist or specifies a language for which the site is not available, e.g. fr)? Thanks for any pointer; I'm here to provide further details or feedback! Best regards
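
    A minimal sketch of one way to do this with mod_rewrite alone (vhost or server-config context assumed; adjust the patterns if placed in an .htaccess): the first rule picks the leading language from Accept-Language when it is a supported one, the second catches everything else and falls back to English.

        RewriteEngine On
        # If the client's preferred language is one we have, go there
        RewriteCond %{HTTP:Accept-Language} ^(de|en|it)
        RewriteRule ^/?$ /%1/ [R=302,L]
        # Otherwise (missing header or unsupported language), fall back to English
        RewriteRule ^/?$ /en/ [R=302,L]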

    Read the article

  • Alternatives to Crashplan for VPS?

    - by Chloe
    I use SFTP Net Drive to mount a remote VPS so I can back it up. However, it has taken 3+ days to scan! I ran 'ls -lR' from my desktop over the mounted network drive and it only took about 5 minutes to list all the files! There are only about 5000 files and 2 GB. I know CrashPlan can run headless on the VPS itself, but that sounds like a pain to set up, and it uses a lot of memory on the server. The VPS doesn't have much memory to spare - it has less than my desktop. Is there another program that can communicate with the CrashPlan backup protocol and has a command-line interface?

    Read the article

  • Is there a way to make 7-Zip temporarily uncompress the whole archive when double-clicking on an exe?

    - by Gnoupi
    In WinRAR, one feature I like is that you can set it to uncompress the whole archive to a temporary place if you double-click an .exe file inside the archive opened in WinRAR. Typically, I often download small games that I just want to try, without the hassle of creating a folder for them, etc. The same goes for archives containing an installer with its own separate files. In the 7-Zip window, if I double-click an exe, it will just extract that exe to a temporary location and launch it. In the small game (or installer) context, that means it will simply fail, because the required files are missing from the same folder. So my question is: is there a way to make 7-Zip extract the whole archive to a temporary folder when launching an exe from inside the archive?

    Read the article

  • Setting up scripts in Amazon EC2 Cloud

    - by racket99
    Hello, I am currently running a few Perl and Python scripts on a Windows PC and would like to port them over to Amazon EC2 servers running 64-bit Linux. The scripts are basic web scrapers that go to a variety of websites, get data and then save it daily as CSV files. I would like to install these in the cloud and get them running in an automated way so that they will run without my intervention. Also, given that I don't want to lose all the data if the instance crashes, I should upload the CSV files to Amazon S3 as well. Any idea how I can do this? I am not terribly well versed in Linux, nor do I know Perl/Python well. What is the best way for me to tackle this?
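
    A hedged sketch of the usual shape of this (the bucket, paths and schedule are placeholders; it assumes the scrapers have been copied to the instance and that the AWS CLI, or an equivalent such as s3cmd, is installed and configured):

        # crontab -e entry: run the scrapers every night at 02:30
        # 30 2 * * * /home/ec2-user/scrapers/run_all.sh >> /home/ec2-user/scrapers/cron.log 2>&1

        # At the end of run_all.sh: push the day's CSV output to S3
        aws s3 cp /home/ec2-user/data/ s3://my-scraper-bucket/csv/ \
            --recursive --exclude "*" --include "*.csv"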

    Read the article

  • SugarCRM CE Won't Install on Ubuntu 10.10

    - by Trenton Scott
    I have a fresh copy of Ubuntu 10.10 server with a working LAMP installation. I downloaded SugarCRM and browsed to its directory to open the installer (via Firefox). The installer appears fine, I accept the license agreement, and it proceeds to check file permissions. It advises that several directories need looser permissions (chmod 766), and I adjust them accordingly. After making the changes, I click "recheck" and the page just reloads as blank (white). There are no errors visible, nothing in the server logs (Apache/PHP) and installation cannot continue. I'm able to get back to the installation tool by readjusting permissions back to my default (0755 for directories, 0644 for files). All files/folders are owned by root and the www-data group. Any idea about what's wrong?
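
    A hedged first step, assuming typical Ubuntu paths (the SugarCRM directory below is a guess): a blank page after a permissions change is often a fatal PHP error that is not being displayed, so surface the error and give the tree to the web server user instead of loosening the modes further.

        # Let the web server own the SugarCRM tree rather than using 766/777
        sudo chown -R www-data:www-data /var/www/sugarcrm
        # Turn on error display long enough to see what the installer hits
        sudo sed -i 's/^display_errors = .*/display_errors = On/' /etc/php5/apache2/php.ini
        sudo service apache2 restart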

    Read the article
