Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • How can I perform a syntax check on an .htaccess file in a shared hosting environment?

    - by Danny
    I have a build script (Perl) that modifies the .htaccess file when I deploy my applications. As a double check, I'd like to perform some sort of syntax check on the generated .htaccess file. I'm familiar with using apachectl -t; however, I'm in a shared hosting environment, and because of file access restrictions I cannot read certain configuration files specified by the sysadmins, so apachectl simply will not work here. Ideas or suggestions welcome.
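
    One workaround, sketched below, is to syntax-check the generated file on a machine where you do control Apache: write a throwaway wrapper config that pulls the .htaccess into directory context, then run the config test against it. All paths are illustrative, and this assumes a local Apache install whose binary accepts -t -f.

        # /tmp/check.conf - throwaway wrapper config
        ServerName localhost
        PidFile /tmp/check.pid
        # ServerRoot and LoadModule lines may be needed, depending on the build,
        # for modules the .htaccess relies on (e.g. mod_rewrite)
        <Directory "/tmp">
            AllowOverride All
            Include /path/to/generated/.htaccess
        </Directory>

        # run the config test against the wrapper (binary name varies: httpd, apache2)
        httpd -t -f /tmp/check.conf && echo "syntax OK"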

    Read the article

  • Glassfish V3 won't start

    - by Thierry
    I installed NetBeans 6.8 and tried to run the GlassFish V3 server. I'm working under Windows Vista, 32-bit. At first it wouldn't run at all. Then I modified the c:\Windows\System32\drivers\etc\hosts file and put the following line into it:

        127.0.0.1 localhost

    Now when I run the GlassFish V3 server, no error is shown, but only "INFO" messages are displayed:

        3 avr. 2010 19:23:19 com.sun.enterprise.glassfish.bootstrap.ASMain main
        INFO: Launching GlassFish on Felix platform
        Welcome to Felix
        ================
        INFO: Perform lazy SSL initialization for the listener 'http-listener-2'
        INFO: Starting Grizzly Framework 1.9.18-k - Sat Apr 03 19:23:24 CEST 2010
        INFO: Starting Grizzly Framework 1.9.18-k - Sat Apr 03 19:23:25 CEST 2010
        INFO: Grizzly Framework 1.9.18-k started in: 423ms listening on port 35127
        INFO: GlassFish v3 (74.2) startup time : Felix(4456ms) startup services(1709ms) total(6165ms)
        INFO: Grizzly Framework 1.9.18-k started in: 459ms listening on port 35116
        INFO: Grizzly Framework 1.9.18-k started in: 428ms listening on port 35155
        INFO: Grizzly Framework 1.9.18-k started in: 470ms listening on port 35160
        INFO: Grizzly Framework 1.9.18-k started in: 513ms listening on port 35159
        INFO: javassist.util.proxy.ProxyFactory.classLoaderProvider = org.glassfish.weld.WeldActivator$GlassFishClassLoaderProvider@5be8f4
        INFO: Hibernate Validator bean-validator-3.0-JBoss-4.0.2
        INFO: Binding RMI port to *:35165
        INFO: Instantiated an instance of org.hibernate.validator.engine.resolver.JPATraversableResolver.
        INFO: JMXStartupService: Started JMXConnector, JMXService URL = service:jmx:rmi://PC-de-Charlotte:35165/jndi/rmi://PC-de-Charlotte:35165/jmxrmi
        INFO: Using com.sun.enterprise.transaction.jts.JavaEETransactionManagerJTSDelegate as the delegate
        INFO: [Thread[GlassFish Kernel Main Thread,5,main]] started
        INFO: Grizzly Framework 1.9.18-k started in: 150ms listening on port 35159
        INFO: Perform lazy SSL initialization for the listener 'http-listener-2'
        INFO: {felix.fileinstall.poll (ms) = 5000, felix.fileinstall.dir = C:\Program Files\sges-v3\glassfish\modules\autostart, felix.fileinstall.debug = 1, felix.fileinstall.bundles.new.start = true, felix.fileinstall.tmpdir = C:\Users\CHARLO~1\AppData\Local\Temp\fileinstall-330907148519261411, felix.fileinstall.filter = null}
        INFO: {felix.fileinstall.poll (ms) = 5000, felix.fileinstall.dir = C:\Users\Charlotte\.netbeans\6.8\GlassFish_v3\autodeploy\bundles, felix.fileinstall.debug = 1, felix.fileinstall.bundles.new.start = true, felix.fileinstall.tmpdir = C:\Users\CHARLO~1\AppData\Local\Temp\fileinstall-2938963288421854459, felix.fileinstall.filter = null}
        INFO: Grizzly Framework 1.9.18-k started in: 95ms listening on port 35160
        INFO: Updating configuration from org.apache.felix.fileinstall-autodeploy-bundles.cfg
        INFO: Installed C:\Program Files\sges-v3\glassfish\modules\autostart\org.apache.felix.fileinstall-autodeploy-bundles.cfg
        INFO: {felix.fileinstall.poll (ms) = 5000, felix.fileinstall.dir = C:\Users\Charlotte\.netbeans\6.8\GlassFish_v3\autodeploy\bundles, felix.fileinstall.debug = 1, felix.fileinstall.bundles.new.start = true, felix.fileinstall.tmpdir = C:\Users\CHARLO~1\AppData\Local\Temp\fileinstall-6474085409014899009, felix.fileinstall.filter = null}

    There is no message such as "GlassFish started", and when I try to access the admin web interface at localhost:4848, localhost:8080 or localhost:8181, it doesn't work. What should I do? Thank you very much. Regards.
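
    A hedged next step for getting more information: start the domain by hand with asadmin in verbose mode, so any startup failure is printed to the console rather than swallowed, and confirm which domain is actually running. The install path and the default domain name domain1 below are assumptions based on the log above.

        cd "C:\Program Files\sges-v3\glassfish\bin"
        asadmin start-domain --verbose domain1
        asadmin list-domains
        rem the full log is also in glassfish\domains\domain1\logs\server.log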

    Read the article

  • pptpd configuration

    - by Ian R.
    I would like a little help configuring pptpd so I can use my server as a VPN server; I have 10 IPs on it and I travel a lot, so that would really help me and my partners. I managed to install everything needed, but my VPN client fails to connect for some reason that I cannot understand. I know there are two files in pptpd that you're supposed to edit, so I will post both of them here.

    /etc/ppp/pptpd-options:

        name pptpd
        refuse-pap
        refuse-chap
        refuse-mschap
        require-mschap-v2
        require-mppe-128
        proxyarp
        nodefaultroute
        lock
        nobsdcomp

    /etc/pptpd.conf:

        option /etc/ppp/pptpd-options
        logwtmp
        localip xx.158.177.231
        remoteip xx.158.177.103,xx.158.177.116,xx.158.177.121,xx.158.177.124,xx.158.177.125,xx.158.177.131,xx.158.177.134,xx.158.177.139,xx.158.177.142,xx.158.177.145

    My interfaces (ifconfig output):

        eth0      Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.231  Bcast:xx.158.177.255  Mask:255.255.254.0
                  inet6 addr: xx80::216:3eff:fe51:31ba/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  RX packets:56352 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:3xx15 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:4884030 (4.8 MB)  TX bytes:6780974 (6.7 MB)
                  Interrupt:16

        eth0:1    Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.103  Bcast:xx.158.177.255  Mask:255.255.254.0
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  Interrupt:16

        eth0:2    Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.116  Bcast:xx.158.177.255  Mask:255.255.254.0
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  Interrupt:16

        eth0:3    Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.121  Bcast:xx.158.177.255  Mask:255.255.254.0
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  Interrupt:16

        eth0:4    Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.124  Bcast:xx.158.177.255  Mask:255.255.254.0
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  Interrupt:16

        eth0:5    Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.125  Bcast:xx.158.177.255  Mask:255.255.254.0
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  Interrupt:16

        eth0:6    Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.131  Bcast:xx.158.177.255  Mask:255.255.254.0
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  Interrupt:16

        eth0:7    Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.134  Bcast:xx.158.177.255  Mask:255.255.254.0
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  Interrupt:16

        eth0:8    Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.139  Bcast:xx.158.177.255  Mask:255.255.254.0
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  Interrupt:16

        eth0:9    Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.142  Bcast:xx.158.177.255  Mask:255.255.254.0
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  Interrupt:16

        eth0:10   Link encap:Ethernet  HWaddr 00:16:3e:51:31:ba
                  inet addr:xx.158.177.145  Bcast:xx.158.177.255  Mask:255.255.254.0
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  Interrupt:16

        lo        Link encap:Local Loopback
                  inet addr:127.0.0.1  Mask:255.0.0.0
                  inet6 addr: ::1/128 Scope:Host
                  UP LOOPBACK RUNNING  MTU:16436  Metric:1
                  RX packets:3 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:3 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:0
                  RX bytes:286 (286.0 B)  TX bytes:286 (286.0 B)
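
    Two things worth checking, sketched below on the assumption that the box runs iptables and that pppd logs to /var/log/syslog: PPTP needs both TCP port 1723 and the GRE protocol allowed in, and each user also needs a line in /etc/ppp/chap-secrets (a third file besides the two above).

        # allow the PPTP control channel and the GRE tunnel traffic
        iptables -A INPUT -p tcp --dport 1723 -j ACCEPT
        iptables -A INPUT -p gre -j ACCEPT

        # watch the daemon's view of a failing connection attempt
        tail -f /var/log/syslog | grep -i -e pptpd -e pppd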

    Read the article

  • File auditing software for Windows Server 2003

    - by David Collantes
    I am looking for a program or program suite that will allow auditing of shared network resources (specifically storage space) and render reports (who created, deleted, moved, or modified files, etc.). Yes, I know I can turn on auditing in Windows, but the Event Log isn't quite the "charmer" for the job.

    Read the article

  • Authoritative sources about Database vs. Flatfile decision

    - by FastAl
    <tldr>Looking for a reference to a book or other undeniably authoritative source that gives reasons for when you should choose a database vs. when you should choose other storage methods. I have provided an un-authoritative list of reasons about 2/3 of the way down this post.</tldr>

    I have a situation at my company where a database is being used where another solution would be better (in this case, an auto-generated piece of source code that contains a static lookup table, searched by binary search). Normally a database would be an OK solution even though the problem does not require one: none of the elements of ACID are needed, the data is read-only, it is updated about every 3-5 years (also requiring other source code changes), it fits in memory, and it can be keyed into via binary search (a tad faster than a db, but speed is not an issue).

    The problem is that this code runs on our enterprise server but is shared with several PC platforms (some disconnected, some using a central DB, etc.), and parts of it are managed by multiple programming units, parts by the DBAs, parts even by mathematicians in another department, etc. These hit their own platform's version of their databases (containing their own copy of the static data). With every implementation, every little change, something different goes wrong. There are many other issues as well. I can't even use a flatfile, because one mode of running on our enterprise server does not have permission to read files (only databases and, of course, its own literal storage, e.g., an in-source table). Of course, other parts of the system use databases in proper, less obscure manners; there is no problem with those parts.

    So why don't we just change it? I don't have the administrative ability to force a change. But I'm affected, because sometimes I have to help fix the problems, but mostly because it causes outages and tons of extra IT time by other programmers, and d*mmit that makes me mad! The reason neither management nor the designers of the system can see the problem is that they propose a solution that won't work: increase communication, implement more safeguards and standards, etc. But every time, in a different part of the already-pared-down but still multi-step process, a few different diligent, hard-working, top-performing IT personnel make a unique subtle error that causes it to fail, sometimes after the last round of testing! In general these are not single-person failures but understandable miscommunications, and communication at our company is actually better than most; people just don't think that's the case because they haven't dug into the matter. However, I have it on very good word from somebody with extensive formal study of sociology and psychology that the relatively small amount of less-than-proper database usage in this gigantic cross-platform, multi-source, multi-language project is bureaucratically un-maintainable. Impossible. No chance. At least with human beings in the loop, and it can't be automated. In addition, the management and developers who could change this, though intelligent and capable, don't understand the rigidity of this 'how humans are' issue, and are not convincible on the matter.

    The reason putting the static data in source code will solve the problem is that, although the solution is less sexy than a database, it would function with no technical drawbacks; and since the sharing of source code already works very well, you basically erase any database-related effort from this section of the project, along with all the drawbacks of it that are causing problems.

    OK, that's the background, for the curious. I won't be able to convince management that this is an unfixable sociological problem, and that the real solution is coding around these limits of human nature, just as you would code around a bug in a 3rd-party component that you can't change. So what I have to do is exploit the unsuitability of the database solution, and not do it using logic, but rather authority. I am aware of many reasons, and of posts on this site giving reasons for one over the other; I'm not looking for lists of reasons like the following (although you can add a comment if I've missed a doozy):

    WHY USE A DATABASE (instead of a flatfile / other)? If you need...
    - Random read / transparent search optimization
    - Advanced / varied / customizable searching and sorting capabilities
    - Transaction / rollback
    - Locks, semaphores
    - Concurrency control / shared users
    - Security
    - 1-many / m-m is easier
    - Easy modification
    - Scalability
    - Load balancing
    - Random updates / inserts / deletes
    - Advanced query
    - Administrative control of design, etc.
    - SQL / learning curve
    - Debugging / logging
    - Centralized / live backup capabilities
    - Cached queries / develop & cache execution plans
    - Interleaved update/read
    - Referential integrity; avoid redundant / missing / corrupt / out-of-sync data
    - Reporting (from an OLAP or OLTP db) / turnkey generation tools

    Disadvantages:
    - Important to get right the first time (professional design), but only because it's meant to last
    - Software & hardware cost
    - Usually over a network, a speed issue (best vs. best design vs. local; even then a separate process requires marshalling / network layers / inter-process communication)
    - Indices and query processing can stand in the way of simple processing (vs. flatfile)

    WHY USE A FLATFILE? If you only need...
    - Sequential row processing only
    - Limited usage, append only (no reading, no master key / update)
    - Only updating the record you're reading (fixed-length records only)
    - Too big to fit into memory
    - Local disk / read-ahead network connection
    - Portability / small system
    - Email / cut & paste / store as document by a novice (simple format)
    - Low design learning curve, but high cost later

    WHY USE IN-MEMORY TABLES (tables, arrays, etc.)? If you need...
    - Processing a single db/ff record that was imported
    - Known size of data
    - Static data, if hardcoding the table
    - Narrow, unchanging use (e.g., one program or proc); includes a class that will be shared but encapsulates its data manipulation
    - Extreme speed needed / high transaction frequency
    - Random access (but search is dependent on implementation)

    Following are some other posts about the topic:

    http://stackoverflow.com/questions/1499239/database-vs-flat-text-file-what-are-some-technical-reasons-for-choosing-one-over
    http://stackoverflow.com/questions/332825/are-flat-file-databases-any-good
    http://stackoverflow.com/questions/2356851/database-vs-flat-files
    http://stackoverflow.com/questions/514455/databases-vs-plain-text/514530

    What I'd like to know is whether anybody can recommend a hard, authoritative source containing these reasons. I'm looking for a paper book I can buy, or a reputable website with whitepapers about the issue (e.g., Microsoft, IBM), not counting the user-generated content on those sites. This will have a greater chance of eliciting the change I'm looking for: less wasted programmer time, and more reliable programs. Thanks very much for your help. You win a prize for reading such a large post!

    Read the article

  • Host timeout during file upload/download over SFTP/SSH

    - by kritop
    I tried different clients because I thought it was client-related, but all of them sooner or later disconnect or stop uploading/downloading files and time out. After a reconnect it works again for a while. Really strange; I cannot figure out the reason. I'm on a Mac and the server is a Debian VPS. If you need further information, please ask. I appreciate any tips, because I'm kind of stuck!
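
    One common cause is an idle-connection timeout in a NAT router or firewall between the two machines; below is a minimal sketch of client-side keepalives that usually rules that out (the host alias and name are illustrative; the server-side equivalent is ClientAliveInterval in sshd_config).

        # add to ~/.ssh/config on the Mac
        Host myvps
            HostName vps.example.com
            ServerAliveInterval 30      # send a keepalive probe every 30 seconds
            ServerAliveCountMax 4       # drop the link only after 4 unanswered probes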

    Read the article

  • Can I right-click a folder in Windows 7 and choose "Open with Sublime Text"?

    - by George P. Burdell
    In Windows 7, I can do this at the command line: "c:\Program Files\Sublime Text 2\sublime_text.exe" samplefolder This launches a Sublime Text window, with the contents of samplefolder loaded in Sublime's sidebar. While this works as needed, it's inconvenient to open a command prompt every single time. Is there any way I can add this behavior to Windows Explorer's right-click menu? I'd like to be able to right-click a folder and "Open with Sublime" just like I can right-click a folder and "Scan for viruses".
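
    A minimal sketch of the usual registry approach, run from a Command Prompt (the verb name "Open with Sublime Text" is arbitrary; the .exe path is the one from the question); the key name doubles as the menu text:

        reg add "HKCR\Directory\shell\Open with Sublime Text" /ve /d "Open with Sublime Text" /f
        reg add "HKCR\Directory\shell\Open with Sublime Text\command" /ve /d "\"C:\Program Files\Sublime Text 2\sublime_text.exe\" \"%1\"" /f

    A similar pair of keys under HKCR\Directory\Background\shell (with %V in place of %1) covers right-clicking the empty space inside an open folder.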

    Read the article

  • Tar exclude not working?

    - by Andrew Fashion
    tar -cvf file.tar --exclude=thumbs/ \
        --exclude=uploads_event/ \
        --exclude=uploads_forum/ \
        --exclude=uploads_admin/ \
        --exclude=uploads_userpoints/ \
        --exclude=uploads_group/ \
        --exclude=up_old/ \
        --exclude=uploads_user/ \
        --exclude=uploads_wall/ \
        directory_to_tar/

    Do I have it wrong? I'm trying to tar the entire directory but completely exclude all those directories and any files in them.
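
    A hedged guess at the fix, assuming GNU tar: drop the trailing slashes, since a pattern like thumbs/ never matches the stored name thumbs, so those excludes silently do nothing. Anchoring the patterns under the directory being archived is the least ambiguous form:

        tar -cvf file.tar \
            --exclude='directory_to_tar/thumbs' \
            --exclude='directory_to_tar/uploads_*' \
            --exclude='directory_to_tar/up_old' \
            directory_to_tar/

    Here uploads_* collapses the seven uploads_ excludes into one pattern; list them individually if future uploads_ directories should not be excluded by default.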

    Read the article

  • Monitor and copy file changes on Windows Server 2003 over NFS or CIFS to *nix

    - by davenolan
    Machine A: Windows Server 2003. Machine B: Ubuntu 9.04. The aim is to copy new and updated files as fast as possible from A to B. B can mount A either as CIFS or NFS (with the Services for UNIX NFS server running on A). This is an absolutely time-critical operation. What is the best way of achieving this? Note: I benchmarked NFS vs. CIFS; CIFS was faster and there was less variance in the speed (I haven't tuned the NFS setup at all).
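
    One sketch of a common setup, assuming the share on machine A is mounted on B and that "as fast as possible" can tolerate a short polling interval: mount over CIFS (faster in your benchmark) and poll with rsync, which copies only new and changed files. The mount point, share name, and credentials below are illustrative.

        # on B, mount the Windows share once
        sudo mount -t cifs //machineA/share /mnt/a -o username=someuser

        # run this from cron or a loop; -a preserves timestamps so unchanged files are skipped
        rsync -av /mnt/a/ /data/mirror/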

    Read the article

  • .htaccess template, suggestions needed

    - by purpler
    # Defaults
    AddDefaultCharset UTF-8
    DefaultLanguage en-US
    FileETag None
    Header unset ETag
    ServerSignature Off
    SetEnv TZ Europe/Belgrade

    # Rewrites
    Options +FollowSymLinks
    RewriteEngine On
    RewriteBase /

    # Redirect to WWW
    RewriteCond %{HTTP_HOST} ^serpentineseo.com
    RewriteRule (.*) http://www.serpentineseo.com/$1 [R=301,L]

    # Redirect index to root
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.html\ HTTP/
    RewriteRule ^(.*)index\.html$ /$1 [R=301,L]

    # Cache media files
    ExpiresActive On
    ExpiresDefault A0

    # Month
    <FilesMatch "\.(gif|jpg|jpeg|png|ico|swf|js)$">
        Header set Cache-Control "max-age=2592000, public"
    </FilesMatch>

    # Week
    <FilesMatch "\.(css|pdf)$">
        Header set Cache-Control "max-age=604800"
    </FilesMatch>

    # 10 Min
    <FilesMatch "\.(html|htm|txt)$">
        Header set Cache-Control "max-age=600"
    </FilesMatch>

    # Do not cache
    <FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$">
        Header unset Cache-Control
    </FilesMatch>

    # Compress output
    <IfModule mod_deflate.c>
        <FilesMatch "\.(html|js|css)$">
            SetOutputFilter DEFLATE
        </FilesMatch>
    </IfModule>

    # Error Documents
    ErrorDocument 206 /error/206.html
    ErrorDocument 401 /error/401.html
    ErrorDocument 403 /error/403.html
    ErrorDocument 404 /error/404.html
    ErrorDocument 500 /error/500.html

    # Prevent hotlinking
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^http://(www\.)?serpentineseo.com/.*$ [NC]
    RewriteRule \.(gif|jpg|png)$ http://www.serpentineseo.com/images/angryman.png [R,L]

    # Prevent offline browsers
    RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:[email protected] [OR]
    RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
    RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
    RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
    RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
    RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
    RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
    RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
    RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
    RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
    RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
    RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
    RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
    RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
    RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
    RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
    RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
    RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
    RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
    RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
    RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
    RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
    RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
    RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Zeus
    RewriteRule ^.*$ http://www.google.com [R,L]

    # Protect against DOS attacks by limiting file upload size
    LimitRequestBody 10240000

    # Deny access to sensitive files
    <FilesMatch "\.(htaccess|psd|log)$">
        Order Allow,Deny
        Deny from all
    </FilesMatch>

    Read the article

  • Dell XPS Problem

    - by Alex B.
    Hi. Yesterday my mom dropped her laptop on the floor, and now she gets the blue screen of death when trying to boot the system. I was able to start a Fedora live CD and get some of her stuff off the hard drive, but I can't seem to install Windows on it: the installation starts loading files and then the computer turns off. I am thinking that she might need a new hard drive. Any ideas?
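
    A drop often damages the drive itself, and that hypothesis can be tested from the Fedora live CD before buying parts; a minimal sketch, assuming smartmontools is present on the live image and the disk appears as /dev/sda:

        sudo smartctl -H /dev/sda           # overall SMART health verdict
        sudo smartctl -t short /dev/sda     # start the drive's built-in self-test
        # a couple of minutes later, read the self-test result:
        sudo smartctl -l selftest /dev/sda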

    Read the article

  • Missing computer?

    - by Kravlin
    I'm trying to find a computer at work whose physical location we can't determine. There's no documentation of it in the inventory, but it responds to nslookup and ping, and I can log onto it and edit its files. However, we have no idea where in the building it is. Does anyone have any good ideas for finding it, short of making it beep repeatedly and annoying people while I run around looking for it?
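
    A sketch of the standard network-side hunt, assuming managed switches you can query: get the machine's MAC address, then chase that MAC through the switches' forwarding (CAM) tables to a physical port and its patch-panel run.

        ping mystery-host      # hypothetical name; this seeds the ARP cache
        arp -a                 # note the MAC address shown for its IP
        # then, on each switch along the path (Cisco IOS syntax shown):
        #   show mac address-table address <mac-from-arp>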

    Read the article

  • What are these folders in the root of my drive?

    - by Max Schmeling
    The following folders keep showing up:

        C:\Default\
        C:\NativeImages\

    They both have a ton of folders in them, with a bunch of HTML files in each folder. I'm assuming this has something to do with NGEN because of the NativeImages folder. How do I get rid of these folders and keep them from coming back, or otherwise move them so they get put somewhere else? Thanks

    Read the article

  • Stop a grep search in directories immediately after the first param match

    - by yael
    Hi friends, I use the following command in my script to find a param under /var:

        grep -R "param" /var/* 2>/dev/null | grep -wq "param"

    My problem is that after grep finds the param in a file, it continues searching until everything under /var/* has been scanned. For example, when I run the command above, grep finds the param after one second but keeps searching other files for the same param for almost 30 seconds. How do I make grep stop immediately after the first match? Thanks
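
    A minimal sketch of one fix: move the -q (and the -w word match) onto the recursive grep itself instead of a second grep at the end of the pipe; with -q, grep exits as soon as it finds the first match rather than scanning the rest of the tree.

        if grep -Rqw "param" /var 2>/dev/null; then
            echo "param found"
        fi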

    Read the article

  • Overcoming maximum file path length restrictions in Windows

    - by Christopher Edwards
    One of our customers habitually uses very long path names (several nested folders with long names), and we routinely run into "user education issues" getting the path down to fewer than 260 characters. Is there a technical solution available? Can we flip some switch in Windows 7 and Windows Server 2008 R2 that says "ignore these historical limits and make 260+ character path names work"? P.S. I have read, and been totally unedified by, Naming Files, Paths, and Namespaces.
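
    Not a true switch, but a long-standing partial workaround is to map the deep part of the tree to a virtual drive letter, so the paths applications see stay under the 260-character MAX_PATH limit; the drive letter and path below are illustrative.

        rem shorten the prefix that every long path starts with
        subst X: "C:\Projects\CustomerName\VeryLongNestedFolderTree"
        rem remove the mapping later with: subst X: /D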

    Read the article

  • How to know the full path of a file using SSH?

    - by Roy
    Hi, I'm a beginner at SSH stuff, but I want to dump a big SQL file, and for that I need to be able to navigate to the appropriate path in my hosting account. I managed to log in over SSH and typed pwd, but it gave me a shared-hosting path like /home/content/r/o/s/roshanjonah. How can I go to the path where I upload my files? I use FTP, but the FTP path just shows /, so I cannot go any further back than that. Using SSH, how can I get to the path I see in FTP? Thanks, Roshan
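
    A short sketch of how the two views usually relate on shared hosts: FTP's / is simply your home directory (or a folder inside it) as seen from the shell, so the pwd output is the prefix to put in front of any FTP path; the html folder name below is a guess.

        pwd                     # e.g. /home/content/r/o/s/roshanjonah
        ls                      # look for the folder that holds your FTP files, e.g. html
        cd html && pwd          # the full shell path of what FTP shows as /
        readlink -f dump.sql    # absolute path of one particular file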

    Read the article

  • Recommended Free Web Hosting Providers?

    - by Noah Goodrich
    I have been tasked with finding a low-budget (read: free) web hosting service for a Facebook group, and would like feedback from the professional community on their experiences with free hosting providers in general, as well as any specific recommendations or evaluations of free (or very inexpensive) providers. Our specific needs for this project include:

    - PHP 5 support
    - MySQL 5.0 support
    - Apache 2
    - Ability to use mod_rewrite in .htaccess files

    Read the article

  • How to distinguish between virtual disk image formats?

    - by Jakub Žitný
    There is a huge number of different formats for virtual storage files for desktop and server purposes (vmdk, qcow2, vdi, vdk, etc.). I'm writing a little script to manipulate them and would like the script to be able to distinguish between them. Of course, this can be done via the extension, but I want something more reliable. I tried the file and qemu-img commands, but the results are not quite clear. Any ideas for improving my methods?
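
    A minimal sketch of one approach: let qemu-img do the detection and scrape out the single line the script needs, since qemu-img info prints a "file format:" line for every format it recognizes (and reports raw when it recognizes none).

        # prints e.g. "qcow2", "vmdk", "vdi" or "raw"
        qemu-img info disk.img | awk -F': *' '/^file format/ {print $2}'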

    Read the article

  • Ubuntu bash command

    - by pedro
    Hello, I want to show the number of lines, words, and characters of all configuration files /etc/*.conf (with the wc command). How can I modify the command so that it does not display the error messages?
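
    A minimal sketch: keep the wc call as-is and redirect stderr to /dev/null, so "Permission denied" messages disappear while the per-file counts and the final total line remain.

        wc /etc/*.conf 2>/dev/null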

    Read the article

  • recommendations for a PDF reader

    - by Tshepang
    I'm looking for a PDF reader that can navigate PDF files like a web browser (i.e., when I click a link that takes me to another part of the document, I should be able to go back by pressing Ctrl+Left Arrow). I am using Debian.

    Read the article
