Search Results

Search found 11542 results on 462 pages for 'download'.


  • Need help setting up lighttpd on Ubuntu 9.10

    - by hap497
    Hi, I am trying to run lighttpd on Ubuntu 9.10. I get the conf file from the doc directory of lighttpd source. $ sudo ./lighttpd -f lighttpd.conf $ ps -ef | grep lighttpd root 2094 1 0 19:40 ? 00:00:00 ./lighttpd -f lighttpd.conf This is my lighttpd.conf: $ more lighttpd.conf # lighttpd configuration file # # use it as a base for lighttpd 1.0.0 and above # # $Id: lighttpd.conf,v 1.7 2004/11/03 22:26:05 weigon Exp $ ############ Options you really have to take care of #################### ## modules to load # at least mod_access and mod_accesslog should be loaded # all other module should only be loaded if really neccesary # - saves some time # - saves memory server.modules = ( # "mod_rewrite", # "mod_redirect", # "mod_alias", "mod_access", # "mod_trigger_b4_dl", # "mod_auth", # "mod_status", # "mod_setenv", # "mod_fastcgi", # "mod_proxy", # "mod_simple_vhost", # "mod_evhost", # "mod_userdir", # "mod_cgi", # "mod_compress", # "mod_ssi", # "mod_usertrack", # "mod_expire", # "mod_secdownload", # "mod_rrdtool", "mod_accesslog" ) ## A static document-root. For virtual hosting take a look at the ## mod_simple_vhost module. server.document-root = "/srv/www/htdocs/" ## where to send error-messages to server.errorlog = "/var/log/lighttpd/error.log" # files to check for if .../ is requested index-file.names = ( "index.php", "index.html", "index.htm", "default.htm" ) ## set the event-handler (read the performance section in the manual) # server.event-handler = "freebsd-kqueue" # needed on OS X # mimetype mapping mimetype.assign = ( ".pdf" => "application/pdf", ".sig" => "application/pgp-signature", ".spl" => "application/futuresplash", ".class" => "application/octet-stream", ".ps" => "application/postscript", ".torrent" => "application/x-bittorrent", ".dvi" => "application/x-dvi", ".gz" => "application/x-gzip", ".pac" => "application/x-ns-proxy-autoconfig", ".swf" => "application/x-shockwave-flash", ".tar.gz" => "application/x-tgz", ".tgz" => "application/x-tgz", ".tar" => "application/x-tar", ".zip" => "application/zip", ".mp3" => "audio/mpeg", ".m3u" => "audio/x-mpegurl", ".wma" => "audio/x-ms-wma", ".wax" => "audio/x-ms-wax", ".ogg" => "application/ogg", ".wav" => "audio/x-wav", ".gif" => "image/gif", ".jar" => "application/x-java-archive", ".jpg" => "image/jpeg", ".jpeg" => "image/jpeg", ".png" => "image/png", ".xbm" => "image/x-xbitmap", ".xpm" => "image/x-xpixmap", ".xwd" => "image/x-xwindowdump", ".css" => "text/css", ".html" => "text/html", ".htm" => "text/html", ".js" => "text/javascript", ".asc" => "text/plain", ".c" => "text/plain", ".cpp" => "text/plain", ".log" => "text/plain", ".conf" => "text/plain", ".text" => "text/plain", ".txt" => "text/plain", ".dtd" => "text/xml", ".xml" => "text/xml", ".mpeg" => "video/mpeg", ".mpg" => "video/mpeg", ".mov" => "video/quicktime", ".qt" => "video/quicktime", ".avi" => "video/x-msvideo", ".asf" => "video/x-ms-asf", ".asx" => "video/x-ms-asf", ".wmv" => "video/x-ms-wmv", ".bz2" => "application/x-bzip", ".tbz" => "application/x-bzip-compressed-tar", ".tar.bz2" => "application/x-bzip-compressed-tar", # default mime type "" => "application/octet-stream", ) # Use the "Content-Type" extended attribute to obtain mime type if possible #mimetype.use-xattr = "enable" ## send a different Server: header ## be nice and keep it at lighttpd # server.tag = "lighttpd" #### accesslog module accesslog.filename = "/var/log/lighttpd/access.log" ## deny access the file-extensions # # ~ is for backupfiles from vi, emacs, joe, ... 
# .inc is often used for code includes which should in general not be part # of the document-root url.access-deny = ( "~", ".inc" ) $HTTP["url"] =~ "\.pdf$" { server.range-requests = "disable" } ## # which extensions should not be handle via static-file transfer # # .php, .pl, .fcgi are most often handled by mod_fastcgi or mod_cgi static-file.exclude-extensions = ( ".php", ".pl", ".fcgi" ) ######### Options that are good to be but not neccesary to be changed ####### ## bind to port (default: 80) #server.port = 81 ## bind to localhost (default: all interfaces) #server.bind = "127.0.0.1" ## error-handler for status 404 #server.error-handler-404 = "/error-handler.html" #server.error-handler-404 = "/error-handler.php" ## to help the rc.scripts #server.pid-file = "/var/run/lighttpd.pid" ###### virtual hosts ## ## If you want name-based virtual hosting add the next three settings and load ## mod_simple_vhost ## ## document-root = ## virtual-server-root + virtual-server-default-host + virtual-server-docroot ## or ## virtual-server-root + http-host + virtual-server-docroot ## #simple-vhost.server-root = "/srv/www/vhosts/" #simple-vhost.default-host = "www.example.org" #simple-vhost.document-root = "/htdocs/" ## ## Format: <errorfile-prefix><status-code>.html ## -> ..../status-404.html for 'File not found' #server.errorfile-prefix = "/usr/share/lighttpd/errors/status-" #server.errorfile-prefix = "/srv/www/errors/status-" ## virtual directory listings #dir-listing.activate = "enable" ## select encoding for directory listings #dir-listing.encoding = "utf-8" ## enable debugging #debug.log-request-header = "enable" #debug.log-response-header = "enable" #debug.log-request-handling = "enable" #debug.log-file-not-found = "enable" ### only root can use these options # # chroot() to directory (default: no chroot() ) #server.chroot = "/" ## change uid to <uid> (default: don't care) #server.username = "wwwrun" ## change uid to <uid> (default: don't care) #server.groupname = "wwwrun" #### compress module #compress.cache-dir = "/var/cache/lighttpd/compress/" #compress.filetype = ("text/plain", "text/html") #### proxy module ## read proxy.txt for more info #proxy.server = ( ".php" => # ( "localhost" => # ( # "host" => "192.168.0.101", # "port" => 80 # ) # ) # ) #### fastcgi module ## read fastcgi.txt for more info ## for PHP don't forget to set cgi.fix_pathinfo = 1 in the php.ini #fastcgi.server = ( ".php" => # ( "localhost" => # ( # "socket" => "/var/run/lighttpd/php-fastcgi.s ocket", # "bin-path" => "/usr/local/bin/php-cgi" # ) # ) # ) #### CGI module #cgi.assign = ( ".pl" => "/usr/bin/perl", # ".cgi" => "/usr/bin/perl" ) # #### SSL engine #ssl.engine = "enable" #ssl.pemfile = "/etc/ssl/private/lighttpd.pem" #### status module #status.status-url = "/server-status" #status.config-url = "/server-config" #### auth module ## read authentication.txt for more info #auth.backend = "plain" #auth.backend.plain.userfile = "lighttpd.user" #auth.backend.plain.groupfile = "lighttpd.group" #auth.backend.ldap.hostname = "localhost" #auth.backend.ldap.base-dn = "dc=my-domain,dc=com" #auth.backend.ldap.filter = "(uid=$)" #auth.require = ( "/server-status" => # ( # "method" => "digest", # "realm" => "download archiv", # "require" => "user=jan" # ), # "/server-config" => # ( # "method" => "digest", # "realm" => "download archiv", # "require" => "valid-user" # ) # ) #### url handling modules (rewrite, redirect, access) #url.rewrite = ( "^/$" => "/server-status" ) #url.redirect = ( "^/wishlist/(.+)" => "http://www.123.org/$1" ) 
#### both rewrite/redirect support back reference to regex conditional using %n #$HTTP["host"] =~ "^www\.(.*)" { # url.redirect = ( "^/(.*)" => "http://%1/$1" ) #} # # define a pattern for the host url finding # %% => % sign # %0 => domain name + tld # %1 => tld # %2 => domain name without tld # %3 => subdomain 1 name # %4 => subdomain 2 name # #evhost.path-pattern = "/srv/www/vhosts/%3/htdocs/" #### expire module #expire.url = ( "/buggy/" => "access 2 hours", "/asdhas/" => "ac cess plus 1 seconds 2 minutes") #### ssi #ssi.extension = ( ".shtml" ) #### rrdtool #rrdtool.binary = "/usr/bin/rrdtool" #rrdtool.db-name = "/var/lib/lighttpd/lighttpd.rrd" #### setenv #setenv.add-request-header = ( "TRAV_ENV" => "mysql://user@host/db" ) #setenv.add-response-header = ( "X-Secret-Message" => "42" ) ## for mod_trigger_b4_dl # trigger-before-download.gdbm-filename = "/var/lib/lighttpd/trigger.db" # trigger-before-download.memcache-hosts = ( "127.0.0.1:11211" ) # trigger-before-download.trigger-url = "^/trigger/" # trigger-before-download.download-url = "^/download/" # trigger-before-download.deny-url = "http://127.0.0.1/index.html" # trigger-before-download.trigger-timeout = 10 #### variable usage: ## variable name without "." is auto prefixed by "var." and becomes "var.bar" #bar = 1 #var.mystring = "foo" ## integer add #bar += 1 ## string concat, with integer cast as string, result: "www.foo1.com" #server.name = "www." + mystring + var.bar + ".com" ## array merge #index-file.names = (foo + ".php") + index-file.names #index-file.names += (foo + ".php") #### include #include /etc/lighttpd/lighttpd-inc.conf ## same as above if you run: "lighttpd -f /etc/lighttpd/lighttpd.conf" #include "lighttpd-inc.conf" #### include_shell #include_shell "echo var.a=1" ## the above is same as: #var.a=1 When I go to browser and hit 'http://127.0.0.1', I get link not found. Any idea?
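
    Not part of the original post, but one likely cause worth checking: the example config points server.document-root at /srv/www/htdocs/, which does not exist on a stock Ubuntu 9.10 install, so every request ends in a 404 ("link not found"). A minimal sketch, assuming you keep that docroot:

        sudo mkdir -p /srv/www/htdocs /var/log/lighttpd
        echo '<html><body>It works</body></html>' | sudo tee /srv/www/htdocs/index.html
        sudo ./lighttpd -t -f lighttpd.conf      # syntax check only
        sudo ./lighttpd -D -f lighttpd.conf      # -D keeps it in the foreground so errors are visible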

    Read the article

  • Error installing Sony Remote Play

    - by Iszi Rory or Isznti
    I'm trying to install Remote Play software to connect my laptop to my PS3. I've found a guide with instructions which seem to be in fairly wide use (found similar walk-throughs on numerous other sites), for running the software on a non-Vaio PC. Tech-Recipies: Playstation 3 – Use Remote Play on any Windows 7 PC The setup essentially goes like this: Download Remote Play software. Download patch by NTAuthority. Install Remote Play as normal. Reboot. Extract NTAuthority patch to Remote Play program folder. Manually register patched DLLs via CLI. Run Remote Play software. Sadly, my problem is early in - Step 3. I had to use Google to find the software download, as the link from Tech-Recipies seems broken. I found the download on Sony's site here: Sony eSupport: Remote Play with PlayStation®3 After downloading and running the software, I hit "Next" at the welcome screen and "I Agree" at the EULA screen. After this, a popup informs me that Setup is checking my computer's information. Then, Setup terminates with this error: I'm running Windows 7 Ultimate x64. Is anyone familiar with this error in this software? Is there a way to work around it? Did I perhaps pick the wrong download from Sony's site?
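
    For reference, step 6 of that walkthrough ("manually register patched DLLs via CLI") is normally done from an elevated Command Prompt with regsvr32; the folder and DLL names below are placeholders, not taken from the guide:

        REM the install path is a guess - use wherever Remote Play actually installed
        cd /d "C:\Program Files (x86)\Sony\Remote Play with PlayStation 3"
        REM repeat for each DLL the NTAuthority patch replaced (name is hypothetical)
        regsvr32 PatchedLibrary.dll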

    Read the article

  • How to place a dropbox file into other dropbox

    - by Nrew
    There is a file I really want to download: the KahelOS live CD. I tried downloading it at a 45 kbps download rate and it would take 3 hours, and the connection is intermittent, so my download got cut off. Is it possible to treat the file as your own and put it into your own Dropbox, so that it would continue downloading from there?
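
    Not an answer about Dropbox, but a downloader that can resume over an unreliable link may be enough on its own; a minimal sketch (the URL is a placeholder):

        # -c resumes a partial download, -t 0 retries forever, --retry-connrefused keeps trying after drops
        wget -c -t 0 --retry-connrefused "http://example.com/kahelos-live-cd.iso"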

    Read the article

  • Virtual Machines and Automatic Software Updates

    - by Zian Choy
    It's obvious that one's main computer should always be have all the latest security patches and most people don't blink an eye when Microsoft Update installs non-security updates. In the land of virtual machines, I've run into 2 problems with automatic updates: The virtual machines are only run when needed. Only Windows virtual machines seem to patch themselves. To elaborate on #1, I generally make a virtual machine with a purpose in mind. For example, when I needed an old copy of Internet Explorer to reproduce a bug in RSS Bandit, I had a Virtual PC named RSS Bandit. The machine only stayed running for a few minutes at a time. Consequently, there is no downtime for the machine to download updates at 3 AM. To elaborate on #2, I've noticed that if I haven't run a Windows virtual machine in a while, then the moment I log in, the computer frantically downloads updates and within seconds, if I click the Start button, there is a little orange shield next to the "Shutdown" button. However, I ran a freshly created Ubuntu VM for several hours today with hundreds of updates pending and it seemed to never download any of them or install any of them. Is there any reason to be concerned about running VMs with dozens of security holes? If I should be concerned, then is there any way to get Ubuntu to download and install updates rather than just advertising a long list of updates to download next century? I've already tried telling Ubuntu to automatically download and install updates.
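
    On the Ubuntu side, the usual way to make a guest patch itself without babysitting is the unattended-upgrades package; a sketch, assuming a stock Ubuntu guest:

        sudo apt-get install unattended-upgrades
        sudo dpkg-reconfigure -plow unattended-upgrades   # writes /etc/apt/apt.conf.d/20auto-upgrades
        # or, since the VM only runs for minutes at a time, force a round of patching at login:
        sudo apt-get update && sudo apt-get -y upgrade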

    Read the article

  • Problems installing Adobe Premiere Elements 8 Content features

    - by Walt Maken
    I recently purchased Adobe Photoshop & Premiere Elements 8. Photoshop Elements 8 installed fine, along with the additional material available via internet download. Premiere Elements 8 installed ok from the DVD. During the Premiere Elements 8 startup process the following message appears: A reduced set of content (Instant Movie Themes, Title and Menu Templates,etc) has been installed. To install the full content set, please insert your Content DVD and run Setup.exe. If you do not have a Content DVD please visit http://www.adobe.com/go/pre_additional_downloads to download the content installer. Since I didn't have the Content DVD, I did the download, which took nearly 12 hours. The extract appeared to complete at 100%, but then immediately gave the error message "A problem occurred while extracting some files. Check available space on your computer and the write privileges on the destination folder." Why would it show this error message if it had completed the extract process 100%? What step(s) do I take now to have Content installed? Do I need to go thru the 12 hour download again or, hopefully, is there something I can do that will make it unnecessary to download again?
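
    Not from the original post, but since the installer blames free space and write permissions, both are quick to check before committing to another 12-hour download; the folder path is a placeholder for wherever the content installer extracts to:

        REM free space on the drive the installer writes to
        fsutil volume diskfree C:
        REM effective permissions on the destination folder (path is hypothetical)
        icacls "C:\PathToPremiereElementsContent"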

    Read the article

  • Limit Windows PC Network/Internet Throughput

    - by Jon Cram
    I have a Vista x64 machine on a fairly fast Internet connection and either buggy drivers for the onboard Ethernet or faulty onboard Ethernet hardware. If I sustain too high a throughput on the Ethernet connection the network connection within Windows fails and I have to restart the machine to restore connectivity. I don't believe I can fix this issue (I'm erring towards faulty hardware) but would like to mitigate the effects by limiting my network throughput. I'm in a position where I would like to download a 5GB file from the Internet (a game install via Steam) and am certain that as this will take a few hours I will not be able to complete the download before my network connection within Windows fails. From downloading content through a BitTorrent client I have found that by limiting the download throughput to around 150 kilobytes per second I can maintain a steady network connection. I can't directly limit the throughput of the download through the Steam client and would instead like to find out how I can limit the throughput of my Ethernet connection within Windows. Any suggestions on how I can achieve this?

    Read the article

  • Is it possible to install Canon EOS T3 Camera Drivers on Windows 7 Embedded

    - by Ryan Johnson
    I have a computer running Windows 7 Embedded. It's an embedded system that will be used in an industrial setting. It needs to be connected to a Canon EOS camera and download pictures from it. Other versions of Windows come with the Canon drivers, so Canon does not provide them as a download on their website. In the past I had a similar issue with the "N" edition of Microsoft Windows Starter and had to download the Microsoft Media Feature Pack, which then installed the drivers. I attempted to install that on this device, but understandably it complains that it's not applicable to this version of Windows. So, is there a feature pack or some other sort of download available that will install the camera drivers? Alternatively, is there somewhere to get the drivers and install them manually? Thanks in advance, Ryan
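
    Windows Embedded Standard 7 is assembled from optional packages, so if the missing camera support (e.g. the still-image/WPD class drivers) ships as a feature package, it can be added with DISM; a sketch, with the .cab path as a placeholder:

        dism /online /get-packages
        dism /online /add-package /packagepath:D:\packages\SomeFeaturePackage.cab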

    Read the article

  • Don't want Folx to become my default downloader

    - by Am1rr3zA
    Hi everyone, I installed the Folx download manager on my MacBook Pro, and now every time I want to download a link in Safari it forces me to download with Folx. How can I change the settings so that I can choose which downloader to use (the default Safari downloader or Folx)? Can anyone recommend a better free downloader than Folx for OS X?

    Read the article

  • LS command for torrent files

    - by amir-beygi
    Hi all, I have a directory full of torrent files and I have to download all of them. The problem is that I have a disk quota on my remote server and the file sizes vary (100MB~8GB); if I add all of the torrent files at once, none of them would finish downloading. So I need a command that lists all my torrents and their sizes, so I can select which ones to add to the download list later. NOTE: the remote server runs Ubuntu 9.10 and is accessed over SSH. In short, I need a command like torrentls that outputs something like: file1.torrent 1111MB file2.torrent 222MB file3.torrent 3333MB file4.torrent 444MB file5.torrent 5555MB
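
    A sketch of such a "torrentls", assuming the transmission-cli package (which ships transmission-show) can be installed on the server; the exact "Total Size" field name may vary between versions, and aria2c -S file.torrent prints similar information if transmission is not available:

        #!/bin/sh
        # print each torrent's file name followed by its advertised total size
        for t in *.torrent; do
            printf '%s\t' "$t"
            transmission-show "$t" | grep -i 'Total Size'
        done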

    Read the article

  • Improving abysmal 802.11n wireless network

    - by concept
    I am in desperate need of help to improve the abysmal performance of my 802.11n wireless network. At best I get 30Mbs (this is an internet download) from a technology that boasts 300Mbs, even worse is the LAN where to date best i have ever gotten is 1Mbs. It is literally quicker to copy the file to a USB and walk it to the other computer. Infrastructure is this AP 802.11n only broadcasting at both 2.4GHz and 5GHz Mac with 802.11a/b/g/n card is connected to the AP via 5GHz Linux with 802.11a/b/g/n card is connected to AP via 2.4GHz I have conducted the following tests (results at end of post) Internet based speed test wired and wireless LAN file copy wired and wireless I have read: http://nutsaboutnets.com/troubleshooting-wi-fi-problems/ http://www.smallnetbuilder.com/wireless/wireless-basics/30664-5-ways-to-fix-slow-80211n-- speed http colon //www.wi-fiplanet dot com/tutorials/7-tips-to-increase-wi-fi-performance.html Slow file transfer on network between two 802.11n laptops (connected directly together via access point) Wireless Network Performance Issues Slower than expected 802.11n wireless network speeds I have made the following optimizations AP broadcasts only 802.11n on both 2.4GHz and 5GHz frequencies 2.4GHz is on a channel with least interference (live in an apartment with lots of APs), this did make a 10Mb/sec improvement Our AP is the only one transmitting on the 5GHz freq. Security: WPA Personal WPA2 AES encryption Bandwidth: 20MHz / 40MHz (i assume this to be channel bonding) I have tried the following with 0 improvement Dropped the Fragment Threshold to 512 Dropped the Request To Send (RTS) Threshold to 512 and 1 Even thought of buying a frequency spectrum analyzer, until i saw the cost of them!!! Speed test results Linux Wired: DOWNLOAD 128.40Mb/s UPLOAD 10.62Mb/s www dot speedtest dot net/my-result/2948381853 Mac Wired: DOWNLOAD 118.02Mb/s UPLOAD 10.56Mb/s www dot speedtest dot net/my-result/2948384406 Linux Wireless: DOWNLOAD 23.99Mb/s UPLOAD 10.31Mb/s www.speedtest dot net/my-result/2948394990 Mac Wireless: DOWNLOAD 22.55Mb/s UPLOAD 10.36Mb/s www.speedtest dot net/my-result/2948396489 LAN NFS 53,345,087 bytes (51Mb) file Linux Mac NFS Wired: 65.6959 Mb/sec Linux Mac NFS Wireless: .9443 Mb/sec All help is appreciated, even testing methods will be accepted.
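
    One measurement missing from the list above that would separate the Wi-Fi link itself from NFS and disk effects is a raw TCP test with iperf between the two machines (the address is a placeholder):

        # on the Mac (server side):
        iperf -s
        # on the Linux box (client side), pointed at the Mac's address:
        iperf -c 192.168.1.10 -t 30 -i 5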

    Read the article

  • ServerRoot in my lighttpd.conf

    - by michael
    Hi, I have use the following example lighttpd.conf to launch my lighttpd. Can you please tell me where is my 'ServerRoot'? # lighttpd configuration file # # use it as a base for lighttpd 1.0.0 and above # # $Id: lighttpd.conf,v 1.7 2004/11/03 22:26:05 weigon Exp $ ############ Options you really have to take care of #################### ## modules to load # at least mod_access and mod_accesslog should be loaded # all other module should only be loaded if really neccesary # - saves some time # - saves memory server.modules = ( # "mod_rewrite", # "mod_redirect", # "mod_alias", "mod_access", # "mod_trigger_b4_dl", # "mod_auth", # "mod_status", # "mod_setenv", "mod_fastcgi", # "mod_proxy", # "mod_simple_vhost", # "mod_evhost", # "mod_userdir", # "mod_cgi", # "mod_compress", # "mod_ssi", # "mod_usertrack", # "mod_expire", # "mod_secdownload", # "mod_rrdtool", "mod_accesslog" ) ## A static document-root. For virtual hosting take a look at the ## mod_simple_vhost module. server.document-root = "/srv/www/htdocs/" ## where to send error-messages to server.errorlog = "/var/log/lighttpd/error.log" # files to check for if .../ is requested index-file.names = ( "index.php", "index.html", "index.htm", "default.htm" ) ## set the event-handler (read the performance section in the manual) # server.event-handler = "freebsd-kqueue" # needed on OS X # mimetype mapping mimetype.assign = ( ".pdf" => "application/pdf", ".sig" => "application/pgp-signature", ".spl" => "application/futuresplash", ".class" => "application/octet-stream", ".ps" => "application/postscript", ".torrent" => "application/x-bittorrent", ".dvi" => "application/x-dvi", ".gz" => "application/x-gzip", ".pac" => "application/x-ns-proxy-autoconfig", ".swf" => "application/x-shockwave-flash", ".tar.gz" => "application/x-tgz", ".tgz" => "application/x-tgz", ".tar" => "application/x-tar", ".zip" => "application/zip", ".mp3" => "audio/mpeg", ".m3u" => "audio/x-mpegurl", ".wma" => "audio/x-ms-wma", ".wax" => "audio/x-ms-wax", ".ogg" => "application/ogg", ".wav" => "audio/x-wav", ".gif" => "image/gif", ".jar" => "application/x-java-archive", ".jpg" => "image/jpeg", ".jpeg" => "image/jpeg", ".png" => "image/png", ".xbm" => "image/x-xbitmap", ".xpm" => "image/x-xpixmap", ".xwd" => "image/x-xwindowdump", ".css" => "text/css", ".html" => "text/html", ".htm" => "text/html", ".js" => "text/javascript", ".asc" => "text/plain", ".c" => "text/plain", ".cpp" => "text/plain", ".log" => "text/plain", ".conf" => "text/plain", ".text" => "text/plain", ".txt" => "text/plain", ".dtd" => "text/xml", ".xml" => "text/xml", ".mpeg" => "video/mpeg", ".mpg" => "video/mpeg", ".mov" => "video/quicktime", ".qt" => "video/quicktime", ".avi" => "video/x-msvideo", ".asf" => "video/x-ms-asf", ".asx" => "video/x-ms-asf", ".wmv" => "video/x-ms-wmv", ".bz2" => "application/x-bzip", ".tbz" => "application/x-bzip-compressed-tar", ".tar.bz2" => "application/x-bzip-compressed-tar", # default mime type "" => "application/octet-stream", ) # Use the "Content-Type" extended attribute to obtain mime type if possible #mimetype.use-xattr = "enable" ## send a different Server: header ## be nice and keep it at lighttpd # server.tag = "lighttpd" #### accesslog module accesslog.filename = "/var/log/lighttpd/access.log" ## deny access the file-extensions # # ~ is for backupfiles from vi, emacs, joe, ... 
# .inc is often used for code includes which should in general not be part # of the document-root url.access-deny = ( "~", ".inc" ) $HTTP["url"] =~ "\.pdf$" { server.range-requests = "disable" } ## # which extensions should not be handle via static-file transfer # # .php, .pl, .fcgi are most often handled by mod_fastcgi or mod_cgi static-file.exclude-extensions = ( ".php", ".pl", ".fcgi" ) ######### Options that are good to be but not neccesary to be changed ####### ## bind to port (default: 80) server.port = 9090 ## bind to localhost (default: all interfaces) server.bind = "127.0.0.1" ## error-handler for status 404 #server.error-handler-404 = "/error-handler.html" #server.error-handler-404 = "/error-handler.php" ## to help the rc.scripts #server.pid-file = "/var/run/lighttpd.pid" ###### virtual hosts ## ## If you want name-based virtual hosting add the next three settings and load ## mod_simple_vhost ## ## document-root = ## virtual-server-root + virtual-server-default-host + virtual-server-docroot ## or ## virtual-server-root + http-host + virtual-server-docroot ## #simple-vhost.server-root = "/srv/www/vhosts/" #simple-vhost.default-host = "www.example.org" #simple-vhost.document-root = "/htdocs/" ## ## Format: <errorfile-prefix><status-code>.html ## -> ..../status-404.html for 'File not found' #server.errorfile-prefix = "/usr/share/lighttpd/errors/status-" #server.errorfile-prefix = "/srv/www/errors/status-" ## virtual directory listings #dir-listing.activate = "enable" ## select encoding for directory listings #dir-listing.encoding = "utf-8" ## enable debugging #debug.log-request-header = "enable" #debug.log-response-header = "enable" #debug.log-request-handling = "enable" #debug.log-file-not-found = "enable" ### only root can use these options # # chroot() to directory (default: no chroot() ) #server.chroot = "/" ## change uid to <uid> (default: don't care) #server.username = "wwwrun" ## change uid to <uid> (default: don't care) #server.groupname = "wwwrun" #### compress module #compress.cache-dir = "/var/cache/lighttpd/compress/" #compress.filetype = ("text/plain", "text/html") #### proxy module ## read proxy.txt for more info #proxy.server = ( ".php" => # ( "localhost" => # ( # "host" => "192.168.0.101", # "port" => 80 # ) # ) # ) #### fastcgi module fastcgi.server = ( "/fastcgi_scripts/" => (( "host" => "127.0.0.1", "port" => 1026, "check-local" => "disable", "bin-path" => "/usr/local/bin/cgi-fcgi", #"docroot" => "/" # remote server may use # it's own docroot )) ) ## read fastcgi.txt for more info ## for PHP don't forget to set cgi.fix_pathinfo = 1 in the php.ini #fastcgi.server = ( ".php" => # ( "localhost" => # ( # "socket" => "/var/run/lighttpd/php-fastcgi.socket", # "bin-path" => "/usr/local/bin/php-cgi" # ) # ) # ) #### CGI module #cgi.assign = ( ".pl" => "/usr/bin/perl", # ".cgi" => "/usr/bin/perl" ) # #### SSL engine #ssl.engine = "enable" #ssl.pemfile = "/etc/ssl/private/lighttpd.pem" #### status module #status.status-url = "/server-status" #status.config-url = "/server-config" #### auth module ## read authentication.txt for more info #auth.backend = "plain" #auth.backend.plain.userfile = "lighttpd.user" #auth.backend.plain.groupfile = "lighttpd.group" #auth.backend.ldap.hostname = "localhost" #auth.backend.ldap.base-dn = "dc=my-domain,dc=com" #auth.backend.ldap.filter = "(uid=$)" #auth.require = ( "/server-status" => # ( # "method" => "digest", # "realm" => "download archiv", # "require" => "user=jan" # ), # "/server-config" => # ( # "method" => "digest", # "realm" => 
"download archiv", # "require" => "valid-user" # ) # ) #### url handling modules (rewrite, redirect, access) #url.rewrite = ( "^/$" => "/server-status" ) #url.redirect = ( "^/wishlist/(.+)" => "http://www.123.org/$1" ) #### both rewrite/redirect support back reference to regex conditional using %n #$HTTP["host"] =~ "^www\.(.*)" { # url.redirect = ( "^/(.*)" => "http://%1/$1" ) #} # # define a pattern for the host url finding # %% => % sign # %0 => domain name + tld # %1 => tld # %2 => domain name without tld # %3 => subdomain 1 name # %4 => subdomain 2 name # #evhost.path-pattern = "/srv/www/vhosts/%3/htdocs/" #### expire module #expire.url = ( "/buggy/" => "access 2 hours", "/asdhas/" => "access plus 1 seconds 2 minutes") #### ssi #ssi.extension = ( ".shtml" ) #### rrdtool #rrdtool.binary = "/usr/bin/rrdtool" #rrdtool.db-name = "/var/lib/lighttpd/lighttpd.rrd" #### setenv #setenv.add-request-header = ( "TRAV_ENV" => "mysql://user@host/db" ) #setenv.add-response-header = ( "X-Secret-Message" => "42" ) ## for mod_trigger_b4_dl # trigger-before-download.gdbm-filename = "/var/lib/lighttpd/trigger.db" # trigger-before-download.memcache-hosts = ( "127.0.0.1:11211" ) # trigger-before-download.trigger-url = "^/trigger/" # trigger-before-download.download-url = "^/download/" # trigger-before-download.deny-url = "http://127.0.0.1/index.html" # trigger-before-download.trigger-timeout = 10 #### variable usage: ## variable name without "." is auto prefixed by "var." and becomes "var.bar" #bar = 1 #var.mystring = "foo" ## integer add #bar += 1 ## string concat, with integer cast as string, result: "www.foo1.com" #server.name = "www." + mystring + var.bar + ".com" ## array merge #index-file.names = (foo + ".php") + index-file.names #index-file.names += (foo + ".php") #### include #include /etc/lighttpd/lighttpd-inc.conf ## same as above if you run: "lighttpd -f /etc/lighttpd/lighttpd.conf" #include "lighttpd-inc.conf" #### include_shell #include_shell "echo var.a=1" ## the above is same as: #var.a=1 Thank you.

    Read the article

  • QR Codes for Files on Google Code

    - by Synetech inc.
    Hi, When you download files from Google Code now (example), in addition to the text version of the SHA1 hash, it includes a QR code of it. The device that the file was downloaded to is the one that has to hash the file. But, if it can download the file (ie, has access to the webpage), it also has access to the text version of the hash, so the QR code seems completely useless—and more work to decode when the raw text is available. How would reading the hash into a mobile phone allow you to verify the file you download to the computer? Or if you download the file to the phone, how would you use the phone to take a picture of the QR code displayed on the webpage on its own screen? Does anyone know what the point to the QR code is or how you would use it to verify the downloaded file (I don’t mean QR codes in general, but specifically in this context).
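
    Whatever the QR code is for, verifying the download on the computer itself is a one-liner; the file name is a placeholder:

        sha1sum googlecode-download.zip                    # Linux
        openssl sha1 googlecode-download.zip               # OS X
        certutil -hashfile googlecode-download.zip SHA1    # Windows, from cmd
        # compare the printed digest with the text hash shown next to the QR code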

    Read the article

  • uTorrent does not work with a proxy server

    - by developer
    I have uTorrent 3.2.2 (build 28500) 32-bit. I am trying to download a torrent through a proxy server, but nothing works: uTorrent reports that the network configuration is wrong. The same proxy settings work fine in Google Chrome and Internet Download Manager. How do I get this working? Also, one more question: is there any way to convert a torrent to a direct download other than zbigz.com, torrific.com and torcache.com (I tried them and they don't work)?
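
    Not from the original question, but one difference worth testing: a browser can get by with an HTTP proxy, while a BitTorrent client generally needs SOCKS (or an HTTP proxy that allows CONNECT to arbitrary ports). A quick check with curl, host and ports being placeholders:

        # does the proxy relay ordinary HTTP?
        curl -v -x http://proxyhost:8080 http://example.com/ -o /dev/null
        # does it also speak SOCKS5, which is what uTorrent's SOCKS5 proxy type expects?
        curl -v -x socks5://proxyhost:1080 http://example.com/ -o /dev/null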

    Read the article

  • Cisco VPN Client For OS X requires a software agreement in place?

    - by JT
    Hi All, I am trying to download the latest Cisco VPN Client for OS X. I get here:http://www.versiontracker.com/dyn/moreinfo/macosx/12696 I click to download, I get redirected to Cisco, I create an account, try to download again and they tell me that I need to have a valid technical support agreement to get access to the software. Really? How do they expect us to VPN into client networks?

    Read the article

  • Torrent upload ratio not updated on Synology DS212+

    - by user179271
    I have a Synology DS212+ NAS running DSM 4.2-3211 (the current version). I use it for several purposes, including downloading torrents with Download Station from a tracker that requires authentication. My problem is that my download/upload ratio isn't being updated, so it keeps falling. The NAS is behind a router, and I configured NAT to forward ports 6890 to 6999 to the NAS's internal IP address. Here are the Download Station settings: TCP port: 6990, sharing ratio: 900%, sharing time: infinite, max download speed: 0 (no limit), max upload speed: 0 (no limit), BT protocol encryption: checked, max number of peers allowed per torrent file: 4000, DHT: checked, with port 6889. When the DHT option is not checked, the NAS doesn't upload any files; I don't know what this option is for. Can someone help me solve this problem? Did I miss a step, or does it come from the NAT? How does Download Station handle the tracker authentication? (Sorry for my English.) Thanks.
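
    One common cause of a ratio that never rises is that peers cannot reach the client, so it is worth confirming from outside your LAN that the forwarded port actually answers; the address is a placeholder:

        # run from a host outside your network (a remote shell, not a LAN machine)
        nc -vz your.public.ip.or.ddns.name 6990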

    Read the article

  • apt-get for a package when "contrib/source/Sources" is not found

    - by Stuart Woodward
    I tried to install Webmin on Ubuntu by following the instructions on http://www.webmin.com/deb.html. Using the Update Manager GUI I added the repositories and the key: deb http://download.webmin.com/download/repository sarge contrib deb http://webmin.mirror.somersettechsolutions.co.uk/repository sarge contrib However, when I run: apt-get update apt-get install webmin I get the error: W: Failed to fetch http://download.webmin.com/download/repository/dists/sarge/Release Unable to find expected entry 'contrib/source/Sources' in Release file (Wrong sources.list entry or malformed file) W: Failed to fetch http://webmin.mirror.somersettechsolutions.co.uk/repository/dists/sarge/Release Unable to find expected entry 'contrib/source/Sources' in Release file (Wrong sources.list entry or malformed file) Looking at the page at that URL I can see: 7029066c27ac6f5ef18d660d5741979a 20 contrib/source/Sources.gz Is the error caused by the fact that the Sources are compressed with gzip, or am I doing something wrong?
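
    Not mentioned in the post, but this exact error usually appears when a deb-src entry exists for a repository that publishes no source packages, and adding a repository through the GUI often creates a matching source entry automatically. A sketch of the fix, assuming that is what happened here:

        # drop any "deb-src ... webmin ..." lines the GUI added, keeping the "deb" lines
        sudo sed -i '/^deb-src.*webmin/d' /etc/apt/sources.list /etc/apt/sources.list.d/*.list
        sudo apt-get update
        sudo apt-get install webmin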

    Read the article

  • Why is Nginx ignoring the access_log directive when post_action is specified?

    - by Chris
    Hi, in the location below nginx writes a custom download log. Everything works fine except when there is a post_action directive; it seems that nginx then skips the access_log directive. Here is the config:

        location /download_intern/ {
            internal;
            if ($uri ~* ^/download_intern/([0-9]+)/) {
                set $transferID $1;
                set $server $arg_ip;
                set $url $arg_url;
                proxy_pass http://$server:80/$url;
                break;
            }
            log_format download '$remote_addr [$time_local] $upstream_cache_status "$scheme://$host$request_uri" $status [$transferID] $body_bytes_sent';
            access_log /opt/nginx/logs/server.download_log download;
            # without this line the download log file is being written
            post_action /done;
        }
        location /done {
            internal;
            # log the transfer on the main server
            proxy_pass http://xxx.xxx.xxx.xxx:80/download_end/?tid=$transferID;
        }
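
    Not an explanation of the post_action behaviour, but one detail worth ruling out: the nginx documentation only allows log_format in the http context. A sketch of the same setup with the format declared at http level and only access_log and post_action left in the location:

        http {
            log_format download '$remote_addr [$time_local] $upstream_cache_status '
                                '"$scheme://$host$request_uri" $status [$transferID] $body_bytes_sent';
            server {
                location /download_intern/ {
                    # ... same internal/if/proxy_pass block as above ...
                    access_log /opt/nginx/logs/server.download_log download;
                    post_action /done;
                }
            }
        }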

    Read the article

  • Read NTFS partition on RHEL 5.8

    - by Alex Farber
    I have RHEL 5.8 64 bit, and NTFS partition on the same disk. How can I get access to this partition? This answer Unable to mount NTFS drive with RHEL 6 doesn't work for me: [root@localhost alex]# rpm -Uvh http://download.fedora.redhat.com/pub/epel/6/i386/epel-release-6-5.noarch.rpm Retrieving http://download.fedora.redhat.com/pub/epel/6/i386/epel-release-6-5.noarch.rpm error: skipping http://download.fedora.redhat.com/pub/epel/6/i386/epel-release-6-5.noarch.rpm - transfer failed - Unknown or unexpected error
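
    The URL in that answer points at EPEL 6 packages; on RHEL 5 the EL5 epel-release is needed instead (fetch the current epel-release-5-x.noarch.rpm from any EPEL mirror - the file name below is only an example), after which ntfs-3g can be installed and the partition mounted:

        sudo rpm -Uvh epel-release-5-4.noarch.rpm
        sudo yum install ntfs-3g
        sudo mkdir -p /mnt/windows
        # replace /dev/sda2 with the actual NTFS partition (see "fdisk -l")
        sudo mount -t ntfs-3g /dev/sda2 /mnt/windows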

    Read the article

  • How to enable .NET Framework 3.5 on Windows 8 without downloading it?

    - by Diogo
    Since I installed the Windows 8 Preview on my personal computer, installing some programs and drivers (Windows 7 ones) has started popping up a message warning that the .NET Framework 3.5 is needed: I could use "Install this feature", let it download some dependencies (300 MB) and that's it, but I don't want to have to download them every time I want to enable this feature on every machine where I install Windows 8. Is there some way to install .NET 3.5 on Windows 8 without having to download the entire framework from Microsoft?
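
    If the machine has the Windows 8 install media (DVD or mounted ISO), the feature can be enabled from that local source instead of Windows Update; a sketch, assuming the media is on drive D: and run from an elevated Command Prompt:

        dism /online /enable-feature /featurename:NetFx3 /all /source:D:\sources\sxs /limitaccess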

    Read the article

  • How can I throttle the bandwidth consumed by Windows Automatic Updates?

    - by eleven81
    We have many Windows XP computers sharing one connection to the internet. These machines are set to download all available automatic updates and then prompt the user to install them. Whenever Patch Tuesday rolls around, our internet usage pegs out, and remains that way for most of the day, and sometimes into the following Wednesday. This hurts! I still want the machines to start to download the updates as soon as they are available, but if it takes until Thursday or Friday before the last updates are downloaded, that's still better than the latency and dropped connections we are seeing now as a result of the internet connection bottleneck. What can I do to throttle back how rapidly each machine downloads the updates, while still having them all start the download process as soon as the updates are available? I have no desire to run a WSUS server. Also, the internet connection is more than enough, whenever there are no updates to download.
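
    Since Automatic Updates downloads through BITS, the usual lever short of WSUS is the BITS bandwidth policy; a sketch of where it lives, assuming the clients (or a domain GPO) have administrative templates recent enough to expose it:

        gpedit.msc (or the equivalent domain GPO)
          Computer Configuration
            -> Administrative Templates
              -> Network
                -> Background Intelligent Transfer Service (BITS)
                  -> "Limit the maximum network bandwidth for BITS background transfers"
                     Enabled, e.g. 150 Kbps during working hours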

    Read the article

  • Super slow opening my downloads folder

    - by Mark
    I have an exe file in my download folder that I half downloaded through utorrent (it's not piracy, a legit file from people who use bittorrent to distribute large files). I think I tried to open it while it was still sharing, that is, did not stop the upload. That actually froze my computer. When I restart in utorrent I set the file to be deleted. Unfortunately even though utorrent doesn't see that file anymore, it's still visible in my download folder. Whenever I try to open my download folder it literally takes 10 minutes or more. It opens, but is empty and the blue progress bar needs a long time to complete. After completion I can use the download folder normally, but opening and closing things in that folder takes a long time. I see the exe that I tried to download. I tried to delete it. But it was taking so long 30+ minutes that I eventually just hit cancel. That doesn't even work, and it was slowing down the computer. Couldn't figure out how to stop the delete so I just pulled the plug. Should I just forget about that dl folder and set a new one? Is there something I can do? Thanks.
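
    Not from the original post, but a common reason Explorer stalls like this is that it tries to read icon and metadata information from the huge half-finished .exe every time the folder opens; deleting it from a Command Prompt sidesteps Explorer entirely (the file name is a placeholder):

        cd /d "%USERPROFILE%\Downloads"
        del /f /q half-downloaded-file.exe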

    Read the article

  • How to bypass Forefront TMG for downloading from Adobe Cloud

    - by user1006272
    I hope that this question has not been asked as I've spent a couple of days googling around trying to find a solution. I have one computer that needs to download from Adobe Cloud to install applications like Photoshop etc... The issue I'm having is that Adobe uses a download manager program (AdobeApplicationManager.exe) that just keeps incrementing the time left on the download of any app like Photoshop. Is there a way to allow just the download manager from that one computer to bypass any filtering settings in Forefront TMG 2010? I have very little knowledge of servers / ISA servers / Forefront TMG and have been thrown into this position by luck I guess. Any help with this would be highly appreciated. Thanks in advance.

    Read the article

  • My internet speed became slow at night

    - by FrozenKing
    My internet plan is 512 kbps unlimited and I get an average speed of 64 kbps. At night I used to get speeds of 112 kbps, but recently my night-time speed has dropped back to the normal daytime level. As I see it, there is usually less traffic at night, so I should be getting good speeds like before. Because of the good night speeds I do my downloading and uploading at night, and my average download+upload per month is 60 GB or 70 GB. Could it be that my ISP is putting a restriction on my downloads and uploads? I am confused.
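
    A unit check, assuming the quoted speeds are what a download manager shows (kilobytes per second): a 512 kbit/s plan works out to exactly the daytime figure, so daytime is the plan performing as sold and the faster nights were the bonus, not the baseline:

        512 kbit/s  /  8 bits per byte  =  64 kB/s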

    Read the article

  • How to circumvent ISP Limiting "Unknown" traffic - (SSH)Proxy, VPN

    - by connery
    I am having issues with using a proxy/VPN, with my current ISP (Comenersol, Spain). From my point of view they limit traffic by protocol or by traffic they "know" and "dont know". I'll explain my findings so far below. Internet connection in Spain: ~400-420KByte/sec (speedtest.net) OpenVPN Server in Sweden(pfsense): 100/100Mbit. LZO Compression. TCP. Tun. Aes128 Squid Proxy server in Sweden (pfsense): 100/100 (same box as the vpn server). Plain, no encryption. Runs in stealth mode to hide the use of proxy. NOT running OpenVPN or Squid Proxy, this is my findings: When I download a file from my pfsense box in Sweden, I get maximum speed When I run speedtest.net and choose any european server (including Swedish), I get max speed When I download a torrent (with non default port above 10K), I get limited to ~100KByte/sec. Encryption is turned off If I download something through https, I get max speed Running either Squid Proxy or VPN, this is my findings When I download a file from my pfsense box in Sweden, I get ~100KByte/sec When I run speedtest.net and choose any european server (including Swedish and Spanish), I get ~100Kbyte/sec When I download a torrent, I get same limitation ~100KByte/sec When I download something through https, I get ~100KByte/sec I verify the speeds above with speedtest.net measure, firefox measure in addition to having bmon running in terminal in the background. This way I am certain that the speeds I get presented, are in fact correct. If I connect through a different ISP with VPN or Squid Proxy, I get better speeds (400KByte/sec ++) In short: Whenever I tunnel my traffic through Sweden, my SPanish ISP throttles the traffic. I thought tunneling it through Squid would solve the issue, since I then would no longer hide my traffic through encryption. This does not seem to be the case. Wget and fetch gives same result. I did not try 'nc', but I assume this would give the same result. Does anyone know how to circumvent this issue? I would very much like to be able to get full speed with Swedish ip, as this would make me able to stream TV at higher quality than today. 100KByte/sec just does not cut it quality wise. Thanks for reading. Looking forward for your help.
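
    Given that plain HTTPS runs at full speed while OpenVPN and Squid traffic is squeezed, one hedged thing to try is moving the tunnel to TCP port 443 so it is less likely to be shaped by protocol: for example an SSH SOCKS proxy on the Swedish box, or OpenVPN in TCP mode on 443 (host name and ports are placeholders):

        # on the Swedish box: have sshd also listen on 443 (add "Port 443" to sshd_config), then from Spain:
        ssh -N -D 1080 -p 443 user@swedish-host.example
        # point the browser or download tool at the SOCKS5 proxy 127.0.0.1:1080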

    Read the article
