Search Results

Search found 41882 results on 1676 pages for 'png files'.


  • Problem with downloading files in Zend Framework

    - by user1400
    Hello all. I am uploading files to the server in my application and that works fine. Now I want other users to be able to download these files, but I get an error. I created an upload folder inside the public folder and I upload my files into it. When I create a link to these files (<a href="http://mytest/public/1.jpg">download image</a>) I get the error "The requested URL /public/upload/1.jpg was not found on this server." How should I set up routing to download these files? Thanks.
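
    A minimal sketch of one likely fix, assuming the standard Zend Framework layout where public/ is the virtual-host document root (the hostname and folder names are the asker's): the URL should not include /public at all, and the stock rewrite rules already let real files bypass the front controller.

        <a href="http://mytest/upload/1.jpg">download image</a>

        # public/.htaccess (stock ZF rules): existing files are served directly
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} -s [OR]
        RewriteCond %{REQUEST_FILENAME} -l [OR]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^.*$ - [NC,L]
        RewriteRule ^.*$ index.php [NC,L]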

    Read the article

  • How to resolve merge conflicts in Mercurial (v1.0.2)?

    - by lajos
    I have a merge conflict, using Mercurial 1.0.2:

        merging test.h
        warning: conflicts during merge.
        merging test.h failed!
        6 files updated, 0 files merged, 0 files removed, 1 files unresolved
        There are unresolved merges, you can redo the full merge using:
          hg update -C 19
          hg merge 18

    I can't figure out how to resolve this. Google search results say to use hg resolve, but for some reason my Mercurial (v1.0.2) doesn't have a resolve command: hg: unknown command 'resolve'. How can I resolve this conflict?
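
    For what it's worth, hg resolve was only added in Mercurial 1.1, so it does not exist in 1.0.2. A minimal sketch of the older workflow, assuming internal:merge is available as a merge tool in this version: redo the merge with conflict markers, fix them by hand, and commit.

        hg update -C 19
        hg --config ui.merge=internal:merge merge 18
        # edit test.h and resolve the <<<<<<< / ======= / >>>>>>> markers
        hg commit -m "merged 18 into 19"

    Upgrading to Mercurial 1.1 or later gives you hg resolve directly.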

    Read the article

  • How to invoke an activity of a library project from an Android app

    - by Austin
    I have some open source Android code that I need to use in my app. It has all the source code as well as resource files, a manifest file, and a classpath; it can be compiled as a separate Android app. I have two constraints for using the open source code, and they are non-negotiable:

    1. I can't change a single line of its code.
    2. I can't use it as a separate app.

    What I have done is compile the open source project as a class library (in Eclipse: Project Properties - Android - tick the "Is Library" checkbox). This generates .class files (in bin) for the Java files, alongside the resource files. The open source project has an Android activity that I want to open from my application, so I linked the directory of these class files in the source section of my Java build path (in .classpath) and declared the activity in my manifest file with the proper action and intent filters. Now when I try to call the activity from my code, it doesn't work; cleaning and rebuilding doesn't help.

    However, if I build the open source project and my app in the same Eclipse workspace and link the open source project into my app in the exact same manner, it works fine. I am not able to identify the difference: all settings and files appear identical in both cases, but only the second one works.

    I have also tried it as a jar file: I built the open source project as a library and exported it into a jar (excluding the manifest file). In that case I get the following error:

        UNEXPECTED TOP-LEVEL EXCEPTION:
        java.lang.IllegalArgumentException: already added: ....
        Conversion to Dalvik format failed with error 1

    I guess this is because the Android 2.2 library has been included twice in my app (once for building my app and once for building the open source project), and I don't know how to avoid that; cleaning the project doesn't help.

    What I need is to use the open source project and invoke its activities from my app without violating the constraints. If I can use it as a bunch of .class files, great; otherwise any other way will do fine. Please look into it and let me know. Thanks.
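
    One detail that may explain the difference: an Android library project is merged into the host app at build time, source and resources together, so pointing the build path at the bare .class files will not carry the library's resources (layouts, drawables, generated R entries) into the final .apk, while an in-workspace library reference does. A minimal sketch of the two pieces involved on the host-app side, with hypothetical package and class names:

        <!-- host app AndroidManifest.xml: declare the library's activity -->
        <activity android:name="com.example.opensourcelib.MainActivity" />

        // host app code: start it by its fully qualified class
        startActivity(new Intent(this, com.example.opensourcelib.MainActivity.class));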

    Read the article

  • doxygen with IDL/ODL

    - by John
    If you have a C++ project with a bunch of .odl files and the .h files generated from them by the ODL compiler, should doxygen be told to parse both .odl and .h, or only one or the other? In general I don't like documenting generated code, but IDL is something of a special case. In any case, the member listing for ODL files doesn't seem to work properly in my tests. Are ODL files parsed correctly?
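
    Doxygen does ship an IDL parser, so one thing worth checking is whether the .odl files are actually being picked up and routed to it. A Doxyfile sketch, assuming a doxygen version new enough to have EXTENSION_MAPPING (1.6+):

        # parse the hand-written ODL alongside the generated headers
        FILE_PATTERNS     = *.h *.odl
        # make sure .odl goes through the IDL parser, not the C++ one
        EXTENSION_MAPPING = odl=IDL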

    Read the article

  • RewriteRule - how to redirect from a folder within a folder to a new domain?

    - by eb_Dev
    Hi, I've been struggling with the following rule:

        RewriteRule ^subdomains/example.com/(.*)$ http://www.example.com/$1 [R=301,L]

    I'm trying to redirect anything after the folder /subdomains/example.com/ to http://www.example.com/, keeping any filename or extra folder path. E.g.:

        www.olddomain.com/subdomains/example.com/index.html       -> www.example.com/index.html
        www.olddomain.com/subdomains/example.com/files/           -> www.example.com/files/
        www.olddomain.com/subdomains/example.com/files/index.html -> www.example.com/files/index.html

    Any help would be greatly appreciated! Thanks, eb_dev
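
    One common catch with a rule like this is where it lives: in the server/vhost config the matched URI starts with a slash, while in a per-directory .htaccess the directory prefix (including the leading slash) is stripped before matching. A sketch of both variants, with the dot escaped for good measure:

        # in the vhost config for www.olddomain.com:
        RewriteEngine On
        RewriteRule ^/subdomains/example\.com/(.*)$ http://www.example.com/$1 [R=301,L]

        # in a .htaccess at the old docroot (no leading slash):
        RewriteEngine On
        RewriteRule ^subdomains/example\.com/(.*)$ http://www.example.com/$1 [R=301,L]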

    Read the article

  • How do I force git to use LF under Windows?

    - by Sorin Sbarnea
    I want to force git to check out files under Windows with LF line endings only, not CR+LF. I checked the two configuration options but was not able to find the right combination of settings. I want it to convert all files to LF and keep LF in the working tree. Remark: I tried autocrlf = input, but that only normalizes files when you commit them; I want checkout to produce LF too.
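
    A sketch of the usual combination: with conversion disabled, whatever is stored in the repository (LF, once the files have been committed normalized) is exactly what lands on disk at checkout.

        # no CRLF conversion in either direction: LF in the repo stays LF on disk
        git config --global core.autocrlf false
        # optionally keep commits normalized while transitioning:
        #   git config --global core.autocrlf input   (CRLF -> LF on commit only)
        # then re-clone, or delete and re-checkout the files, so the
        # working tree is rebuilt under the new setting

    If your git is new enough (1.7.2+), this can also be pinned per repository with a .gitattributes line such as `* text eol=lf`.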

    Read the article

  • Pre-update SVN script to filter what gets updated

    - by DrLuk
    Imagine a repository with many kinds of files. I want to get only some kinds of files from this repository, in a sort of "filter process". I mean that ALL files are versioned, but for my local work I only want to fetch, say, *.php files and skip downloading the *.jpg files. I am thinking about a client-side hook script (pre-update). Does anyone know if this is possible? Thanks!
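
    Subversion has no client-side hooks at all, so there is no pre-update hook to write; the closest substitute is a wrapper script around svn update using sparse checkouts. A rough sketch, assuming a Subversion 1.5+ client (for --set-depth, and for update --parents; if your client lacks --parents, update each directory with --set-depth instead):

        #!/bin/sh
        # hypothetical update wrapper: fetch only the repository's .php files
        svn update --set-depth empty .
        svn list -R | grep '\.php$' | while read f; do
            svn update --parents "$f"
        done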

    Read the article

  • C++ code generation with Python

    - by norapinephrine
    Can anyone point me to some documentation on how to write scripts in Python (or Perl, or any other Linux-friendly scripting language) that generate C++ code from XML or .py files from the command line? I'd like to be able to write some XML files and then run a shell command that reads these files and generates .h files with fully inlined functions, e.g. streaming operators, constructors, etc.
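
    A minimal self-contained sketch of the idea in Python, with an invented <class>/<field> XML schema; a real generator (and the asker's schema) will of course differ:

        #!/usr/bin/env python
        # gen_header.py -- emit a C++ header with inlined accessors from an XML spec
        import sys
        import xml.etree.ElementTree as ET

        def emit_class(cls):
            name = cls.get("name")
            lines = ["class %s {" % name, "public:"]
            for f in cls.findall("field"):
                t, n = f.get("type"), f.get("name")
                lines.append("    %s %s() const { return %s_; }" % (t, n, n))
                lines.append("    void set_%s(%s v) { %s_ = v; }" % (n, t, n))
            lines.append("private:")
            for f in cls.findall("field"):
                lines.append("    %s %s_;" % (f.get("type"), f.get("name")))
            lines.append("};")
            return "\n".join(lines)

        def main():
            tree = ET.parse(sys.argv[1])                  # e.g. types.xml
            body = "\n\n".join(emit_class(c) for c in tree.getroot().findall("class"))
            with open(sys.argv[2], "w") as out:           # e.g. types.h
                out.write("// generated -- do not edit\n#pragma once\n\n" + body + "\n")

        if __name__ == "__main__":
            main()

    Run as: python gen_header.py types.xml types.h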

    Read the article

  • Remove files in a directory and its subdirectories without destroying the directory structure

    - by user3713876
    In a shell script, I want to delete only the text files and log files in the following structure, without removing the directory or its subdirectories:

        bar/
        |---file1.txt
        |---file2.txt
        |---subdir1/
        |   |---file1.log
        |   |---file2.log
        |---subdir2/
            |---image1.log
            |---image2.log

    I am using rm -rf /bar/*, but that removes everything, leaving only:

        bar/

    What I want is:

        bar/
        |---subdir1/
        |---subdir2/

    That is, I want to remove only the text, log, or csv files without removing the directory and the subdirectories.
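
    A sketch with find, which deletes matching files at any depth and never touches directories:

        # GNU find: delete only .txt, .log and .csv files under bar/
        find /bar -type f \( -name '*.txt' -o -name '*.log' -o -name '*.csv' \) -delete

        # portable variant without -delete:
        find /bar -type f \( -name '*.txt' -o -name '*.log' -o -name '*.csv' \) -exec rm -f {} +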

    Read the article

  • Why am I getting "Problem loading the page" after enabling HTTPS for Apache on Windows 7?

    - by Anish
    I enabled HTTPS on the Apache server (2.2.15) on Windows 7 Enterprise by uncommenting

        Include /private/etc/apache2/extra/httpd-ssl.conf

    in C:\Program Files (x86)\Apache Software Foundation\Apache2.2\conf\httpd.conf, and modifying C:\Program Files (x86)\Apache Software Foundation\Apache2.2\conf\httpd-ssl.conf to include:

        DocumentRoot "C:/Program Files (x86)/Apache Software Foundation/Apache2.2/htdocs"
        ServerName myserver.com:443
        ServerAdmin [email protected]
        ...
        SSLCertificateFile "C:/Program Files (x86)/Apache Software Foundation/Apache2.2/conf/cert.pem"
        SSLCertificateKeyFile "C:/Program Files (x86)/Apache Software Foundation/Apache2.2/conf/key.pem"

    Then I restart Apache (Start - All Programs - Apache Server 2.2 - Control - Restart) and go to localhost on port 443 in Firefox, where I get:

        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
        <html>
        <head><title>Index of /</title></head>
        <body>
        <h1>Index of /</h1>
        <ul><li><a href="MyPageLinks/"> Links/</a></li>
        .....
        </ul>
        </body></html>

    But on the displayed web page I see:

        Unable to connect
        Firefox can't establish a connection to the server at localhost.
        * The site could be temporarily unavailable or too busy. Try again in a few moments.
        * If you are unable to load any pages, check your computer's network connection.
        * If your computer or network is protected by a firewall or proxy, make sure that Firefox is permitted to access the Web.

    I read "Why am I getting 403 Forbidden after enabling HTTPS for Apache on Mac OS X?" and added a default web server configuration block to match my DocumentRoot. The error log C:\Program Files (x86)\Apache Software Foundation\Apache2.2\logs\error.log gives the following error:

        The Apache2.2 service is running.
        (OS 5)Access is denied. : Init: Can't open server certificate file C:/Program Files (x86)/Apache Software Foundation/Apache2.2/conf/cert.pem

    I checked the permissions on cert.pem: all permissions (Full control, Read, Read and modify, Execute, Write) are granted to Admin, and I am currently logged in as Admin. I tried using oldcert.pem and oldkey.pem on the same server and it works fine. Is there anything I missed?
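
    "(OS 5) Access is denied" while the logged-in Admin can read the file usually points at the service account: the Apache service does not run as the interactive Admin user. A sketch of one way to grant read access, assuming the service runs as Local System (check the service's Log On tab and grant whichever account is listed there if it differs):

        icacls "C:\Program Files (x86)\Apache Software Foundation\Apache2.2\conf\cert.pem" /grant SYSTEM:R
        icacls "C:\Program Files (x86)\Apache Software Foundation\Apache2.2\conf\key.pem" /grant SYSTEM:R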

    Read the article

  • How to setup Munin permissions?

    - by Mark Robinson
    I've just installed Munin on my CentOS server, but I can't get it to output anything to the html directory I set in /etc/munin/munin.conf:

        htmldir /home/mydir/munin

    In /var/log/munin/munin-graph.log I get errors like:

        2011/09/23 12:35:30 [RRD ERROR] Unable to graph /home/mydir/munin/localhost/localhost/memory-year.png : Opening '/home/mydir/munin/localhost/localhost/memory-year.png' for write: Permission denied

    The permissions on /home/mydir/munin are:

        drwxrwxr-x 2 munin munin 4096 Sep 23 12:31 munin
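
    Since munin:munin already owns the htmldir itself, the usual culprit is a parent directory: the munin user needs execute (traverse) permission on every directory along the path, and home directories are often 700. A quick sketch to check, and to fix it assuming that is indeed the cause:

        # can the munin user actually write there?
        sudo -u munin touch /home/mydir/munin/probe && echo ok
        # if not, open up traversal on the parent
        chmod o+x /home/mydir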

    Read the article

  • Why do I keep getting this ImageMagick error?

    - by Gkhan14
    I am currently using this ImageMagick command:

        convert "c:\users\****\My Documents\test.png" -transparent white test2.png

    However, I keep getting two errors that look like this:

        convert.exe: unable to open image `c:\users\****\My': No such file or directory @ error/blob.c/OpenBlob/2641.
        convert.exe: no decode delegate for this image format `c:\users\****\My' @ error/constitute.c/ReadImage/550.

    I installed ImageMagick to my c:\ directory. What does this mean, and how can I fix it?
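
    The errors show the path being split at the space in "My Documents", which means the quotes never reached convert.exe (this happens, for example, when the command runs through a script or exec call that strips them, or when smart quotes sneak in). One way to sidestep quoting entirely, keeping the asker's **** placeholder:

        cd "c:\users\****\My Documents"
        convert test.png -transparent white test2.png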

    Read the article

  • The Dropbox public link doesn't work

    - by user828896
    Recently I have found that Dropbox public links such as https://dl.dropboxusercontent.com/u/209352/Temp/Mycss.PNG sometimes don't work, and I have to use a Dropbox share link such as https://www.dropbox.com/s/nr4bb00sd80pgl9/Mycss.PNG instead; the share links seem to work well. Could someone tell me what the difference is between a public link and a share link? Can I always use share links instead of public links?

    Read the article

  • Slow wifi from Windows Server 2003 virtualized in XenServer

    - by John Clayton
    I'm a brand spanking new user of OS X, coming from a lifetime of Windows use. I've been setting up my new MacBook Pro and have run into a very unusual problem: over wifi, I am unable to copy files to or from my Windows Home Server. The problem seems to exist only over wifi, and only with WHS. Here are the details of my setup:

    - 2010 MacBook Pro (Core i7), OS X 10.6.3
    - Windows Home Server PP3 (virtualized in XenServer 5.5)
    - Windows 7 Ultimate x64 desktop
    - Windows 7 Ultimate x64 in Boot Camp
    - D-Link DIR-655 wireless N router

    Here is what I've done to narrow down the problem:

    - Files copy fine from WHS to OS X when using gigabit ethernet
    - Files copy fine from desktop to OS X when using gigabit ethernet
    - Files fail to copy from WHS to OS X when using wifi (error -51)
    - Files copy fine from desktop to OS X when using wifi
    - Files copy fine from WHS to Boot Camp when using wifi
    - Files copy fine from desktop to Boot Camp when using wifi

    From what I can tell, it seems to be some sort of issue between OS X and WHS, but I can't for the life of me see what would be different between shares on WHS and on my desktop. Both are connected using smb://ADDRESS (I've tried both IP and name). I can browse the shares on the WHS, but copying to OS X fails.

    I originally found the issue while installing VS2010 from an ISO on WHS, mounted to a Windows 7 VM using VMware Fusion. During the installation the VM was unusable; even its clock fell about 8 minutes behind the host. Once I plugged in the ethernet and disabled the wifi, things picked up and finished quickly. The Fusion 3.1 RC is the only thing I can think of that I installed that may have messed with the wifi driver. I've also tried resetting the wifi router, and have changed it from G & N to N-only. Under Boot Camp I get similar speeds to my wife's N laptop. Any ideas? Thanks!

    Update: The issue has been further narrowed down to Windows Server 2003, which Windows Home Server is based on, running in XenServer with the XenServer tools installed.

    Read the article

  • Windows 7 file-based backup service

    - by Ben Voigt
    I'm looking for a good replacement for Lazy Mirror, since it doesn't support Windows 7 well.

    Pros:

    - One of the things I really loved about Lazy Mirror is that it always maintains a "full" backup, but does so by copying only modified files. As each file was copied, the old version got archived (moved to an out-of-the-way location). So after mirroring ran, there'd be a complete copy of the file system, which could even be booted if necessary. At the same time, extra space on the backup media was used to store as many older versions of files as possible, without wasting space storing multiple copies of the same version. It seems that with Windows 7 backup there'd be wasted space storing the same data in both the system image and the file backup.
    - It was completely file-based, but also aware of the registry (it had a feature to dump the live registry to hive files in the correct format).
    - The backups were normal NTFS filesystems; no special tool was needed to read them.
    - It automatically cleaned out the oldest previous versions when space ran out (unlike Windows 7 backup, which apparently simply starts failing when the backup media fills).
    - It copied all file attributes, including security.

    Cons:

    - It doesn't deal well with junction points, symbolic links, and hard links.
    - It didn't run as a service without lots of help from firesrv or srvany, and then you couldn't interact with the GUI. Running as a service was necessary to be able to mirror protected OS files.
    - It didn't have open file handling, except for registry hives.
    - I guess the file-by-file archive and replacement could leave mismatched sets of files if the mirror was interrupted. This would be the advantage of incremental backup techniques that require the old full backup plus all intermediate incremental backups to restore. But I don't see this as much of a problem: you'd really only have a boot failure if you had a mixture of pre- and post-service-pack files, and I can run a full image backup using another tool before applying a service pack.

    Does anyone know of a tool that does both full-system backup and storage of old versions of files like Lazy Mirror did (without storing the same data multiple times), and can also run as a service on Windows 7? Free is best of course, but a reasonably priced paid program would be fine too. It would be absolutely awesome if it also triggered a backup/mirror pass when a particular external drive was plugged in, and generated popup warnings if backups hadn't been run recently.

    Read the article

  • Nginx rewrite URL only if file exists

    - by Jose Fernandez
    I need to write a rewrite rule for Nginx so that if a user requests an old image URL, /images/path/to/image.png, and the file doesn't exist, the request is redirected to /website_images/path/to/image.png, ONLY if the image exists at the new path; otherwise continue with the 404. The version of Nginx on our host doesn't have try_files yet.
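
    A sketch without try_files, assuming both trees live on disk under the same document root: test for the old file's existence with the rewrite module's if (one of the cases where if is safe, since only rewrite-module directives appear inside it). The observable behavior matches the requirement, because when the new file is missing too, the fallback path simply 404s on its own.

        location /images/ {
            if (!-f $request_filename) {
                rewrite ^/images/(.*)$ /website_images/$1 last;
            }
        }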

    Read the article

  • Colors are not displayed correctly in GVim on some computers

    - by MARTIN Damien
    I am trying to use a colorscheme. On my desktop it looks like it should: https://github.com/martin-damien/tetrisity-vim/blob/master/tetrisity-vim.png But on my laptop I get the following colors: http://img703.imageshack.us/img703/8444/errorufl.png As you can see, the simplest and most visible difference is in the comments. They should be grey on black, but they come out blue on transparent. What could cause such errors?
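
    Grey-on-black turning into blue-on-transparent usually means the colorscheme's GUI colors are not being used, e.g. because the laptop is actually running terminal Vim with few colors rather than GVim. A few checks worth running on the laptop (sketch only):

        :echo has('gui_running')   " 0 means terminal vim, not GVim
        :echo &t_Co                " terminal color count; 8 often produces this
        :set t_Co=256              " worth trying when gui_running is 0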

    Read the article

  • nginx proxy to different path

    - by David Robertson
    I've read through the documentation for nginx's HttpProxyModule, but I can't figure this out: I want it so that if someone visits, for example, http://ss.example.com/1339850978, nginx will proxy them http://dl.dropbox.com/u/xxxxx/screenshots/1339850978.png. If I just used this line in my config file:

        proxy_pass http://dl.dropbox.com/u/xxxxx/screenshots/;

    then they would have to append the .png themselves. Thanks in advance, David.
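
    A sketch that appends the .png server-side by capturing the ID in a regex location (the u/xxxxx path is the asker's placeholder). Because the proxy_pass URI is built from a variable here, nginx needs a resolver directive to look up the upstream hostname at runtime:

        resolver 8.8.8.8;
        location ~ ^/(\d+)$ {
            proxy_pass http://dl.dropbox.com/u/xxxxx/screenshots/$1.png;
        }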

    Read the article

  • APC File Cache not working but user cache is fine

    - by danishgoel
    I have just got a VPS (with cPanel/WHM) to test what gains I could get in my application by using the APC file cache AND user cache. First I got PHP 5.3 compiled as a DSO (Apache module), then installed APC via PECL over SSH. (First I tried the WHM module installer; it had the same problem, so I tried via SSH.) All seemed fine: phpinfo showed APC loaded and enabled, and apc.php looked OK. But as I started testing my PHP application, the File Cache Information stats in apc.php read:

        Cached Files                 0 ( 0.0 Bytes)
        Hits                         1
        Misses                       0
        Request Rate (hits, misses)  0.00 cache requests/second
        Hit Rate                     0.00 cache requests/second
        Miss Rate                    0.00 cache requests/second
        Insert Rate                  0.00 cache requests/second
        Cache full count             0

    So no PHP files were being cached, even though I had browsed through over 10 PHP files with multiple includes; there should have been some cached files. The user cache, however, is functioning fine:

        User Cache Information
        Cached Variables             0 ( 0.0 Bytes)
        Hits                         1000
        Misses                       1000
        Request Rate (hits, misses)  0.84 cache requests/second
        Hit Rate                     0.42 cache requests/second
        Miss Rate                    0.42 cache requests/second
        Insert Rate                  0.84 cache requests/second
        Cache full count             0

    (Those numbers come from an APC caching test script that stores and retrieves 1000 entries and reports the times; a sort of simple benchmark.)

    Can anyone help me here? Even though apc.cache_by_default = 1, no PHP files are being cached. This is my APC config (runtime settings):

        apc.cache_by_default        1
        apc.canonicalize            1
        apc.coredump_unmap          0
        apc.enable_cli              0
        apc.enabled                 1
        apc.file_md5                0
        apc.file_update_protection  2
        apc.filters
        apc.gc_ttl                  3600
        apc.include_once_override   0
        apc.lazy_classes            0
        apc.lazy_functions          0
        apc.max_file_size           1M
        apc.mmap_file_mask
        apc.num_files_hint          1000
        apc.preload_path
        apc.report_autofilter       0
        apc.rfc1867                 0
        apc.rfc1867_freq            0
        apc.rfc1867_name            APC_UPLOAD_PROGRESS
        apc.rfc1867_prefix          upload_
        apc.rfc1867_ttl             3600
        apc.serializer              default
        apc.shm_segments            1
        apc.shm_size                32M
        apc.slam_defense            1
        apc.stat                    1
        apc.stat_ctime              0
        apc.ttl                     0
        apc.use_request_time        1
        apc.user_entries_hint       4096
        apc.user_ttl                0
        apc.write_lock              1

    Also, most PHP files are under 20KB, so apc.max_file_size = 1M is not the cause. I have also tried using apc_compile_file to force some files into the opcode cache, with no luck. I reinstalled APC with debugging enabled, but nothing shows in the error_log. I have also tried setting mmap_file_mask to /dev/zero and to /tmp/apc.xxxxxx, and set /tmp permissions to 777, to no avail. Any clue, anyone?

    Update: I have tried the following things and none of them cause the APC file cache to populate:

    1. Set apc.enable_cli = 1 AND run a script from the CLI
    2. Set apc.max_file_size = 5M (just in case)
    3. Switched the PHP handler from DSO to FastCGI in WHM (then switched back to DSO, as it did not solve the problem)
    4. Even tried restarting the container

    Read the article

  • Port is not open in torrent client even though it was forwarded

    - by aukxn
    I have a problem with port forwarding for my torrent client. The port check tool says the port is open, and the firewall exception is in place as well, but the client's own test says otherwise, and I don't know why. Can anybody help me fix this problem? The download and upload speeds in the torrent client are very slow. I don't have enough reputation to post images, so here are the links: http://i.stack.imgur.com/tgOdr.png http://i.stack.imgur.com/OgjX4.png

    Read the article

  • Copy images using a single DOS command

    - by Haroon
    Hi guys, I'm wondering if it's possible to copy only image files from a directory. For example, if the source directory has:

        a.jpg b.gif c.png d.txt

    I want to copy only the images (using one command), to get this in the destination directory:

        a.jpg b.gif c.png
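
    Two sketches for cmd.exe, with hypothetical source and destination paths; extend the extension list as needed:

        rem single line at the prompt: loop over the image extensions
        for %e in (jpg gif png) do copy "C:\src\*.%e" "C:\dest\"

        rem or with robocopy (ships with Windows Vista/7), which accepts
        rem multiple wildcards in one call:
        robocopy C:\src C:\dest *.jpg *.gif *.png

    In a .bat file, double the percent sign: %%e.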

    Read the article
