Search Results

Search found 14878 results on 596 pages for 'mod security'.

Page 73/596 | < Previous Page | 69 70 71 72 73 74 75 76 77 78 79 80  | Next Page >

  • Wordpress .htaccess preventing subfolder access

    - by John K.
    This is sort of a goofy setup, but it's not in my power to reconfigure it at this time. I'm running in a shared hosting environment. The domain is example.com. This is an add-on domain on the host side, with example.com being redirected to the www/example.com sub-directory. That directory houses a standard WordPress site which acts as the main site when you visit example.com. The .htaccess file within that directory is:

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteRule ^wp-admin/profile\.php$ /ssm/welcome [R]
        </IfModule>

    I also have a subdirectory at the root level, alongside the /example.com subdirectory, that houses a CakePHP application: /tracker. My problem is that when I attempt to browse to example.com/tracker, I get a 404 from WordPress because permalinks are on. What I think I need is either a rewrite rule in the WordPress .htaccess file that short-circuits the existing rewrite rules and permits example.com/tracker to work independently of the WordPress install, or a rewrite rule at the root level that short-circuits the redirect to the /example.com directory in the first place. Not sure how well I explained that, so here's a summary. The www/ directory structure:

        example.com/
        tracker/

    It's an add-on domain of www.example.com redirecting to the /example.com directory with WordPress, plus a tracker/ directory running CakePHP, which I would like to access via www.example.com/tracker. If you need further info or clarification, let me know!
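
    One commonly suggested approach (a sketch only; it assumes the host makes /tracker resolve to a real path under the add-on domain's document root, e.g. via a symlink) is to exempt that path before WordPress's catch-all runs, immediately after RewriteBase /:

        # Leave anything under /tracker to the CakePHP app;
        # [L] stops the WordPress rules from touching it.
        RewriteRule ^tracker($|/) - [L]

    If /tracker lives entirely outside the docroot that example.com maps to, no per-directory rewrite inside the WordPress folder can reach it, and the exemption has to happen at whatever level performs the initial add-on-domain redirect.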

    Read the article

  • Using PHP Redirect Script together with Custom Fields (WordPress)? [on hold]

    - by Alex Scherer
    I am currently trying to make Yoast's link-cloaking script ( Yoast.com script manual // Github Script files ) work together with the WordPress plugin Advanced Custom Fields. The script fetches two values (redirect id, redirect url) via GET and then redirects to the URL defined for that id in a .txt file called redirects.txt. I would like to change the script so that I can define both the id and the redirection URL via custom fields on each post in my WP dashboard. I would be really happy if someone could help me code something that does the same as the script above, but fetches those values from custom fields instead of reading them from a redirects.txt file. Best regards! Alex

    Read the article

  • Force www. on multi domain site and retain http or https

    - by John Isaacks
    I am using CakePHP, which already contains an .htaccess file that looks like:

        <IfModule mod_rewrite.c>
        RewriteEngine on
        RewriteRule ^$ app/webroot/ [L]
        RewriteRule (.*) app/webroot/$1 [L]
        </IfModule>

    I want to force www. (unless it is a subdomain) to avoid duplicate-content penalties, and it needs to retain http or https. Also, this application will have multiple domains pointing to it, so the code needs to be able to work with any domain.
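
    A pattern that often gets suggested for this (a sketch; the two-label host test below treats anything like sub.example.com as a subdomain and skips it, and it won't handle two-part TLDs such as .co.uk):

        RewriteEngine On
        # Only act on hosts without www. that have exactly two dot-separated labels.
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteCond %{HTTP_HOST} ^[^.]+\.[^.]+$
        # Capture an "s" only when HTTPS is on, so %1 rebuilds the scheme.
        RewriteCond %{HTTPS}s ^on(s)|offs$ [NC]
        RewriteRule ^ http%1://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]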

    Read the article

  • Apt-get takes a long time to update/upgrade

    - by ShockwaveNN
    On my work network any apt-get (or aptitude) command takes a very long time; it looks like the admins blocked some port (for unknown reasons). For example, sudo apt-get update takes something like 2 days, and all I get is a very long list of responses like:

        Get: 36 http://security.ubuntu.com precise-security/universe amd64 Packages [11.6 kB]
        Get: 37 http://security.ubuntu.com precise-security/universe amd64 Packages [11.6 kB]
        Get: 38 http://security.ubuntu.com precise-security/universe amd64 Packages [11.6 kB]
        Get: 39 http://security.ubuntu.com precise-security/universe amd64 Packages [11.6 kB]
        Get: 40 http://security.ubuntu.com precise-security/universe amd64 Packages [11.6 kB]

    The same thing happens when I try to download software:

        Get:1 http://archive.ubuntu.com/ubuntu/ precise/main dash i386 0.5.7-2ubuntu2 [85.8 kB]
        Get:2 http://archive.ubuntu.com/ubuntu/ precise/main dash i386 0.5.7-2ubuntu2 [85.8 kB]
        Get:3 http://archive.ubuntu.com/ubuntu/ precise/main dash i386 0.5.7-2ubuntu2 [85.8 kB]
        Get:4 http://archive.ubuntu.com/ubuntu/ precise/main dash i386 0.5.7-2ubuntu2 [85.8 kB]
        Get:5 http://archive.ubuntu.com/ubuntu/ precise/main dash i386 0.5.7-2ubuntu2 [85.8 kB]

    Is there something I can do to change the port for apt-get, or something else?

    Read the article

  • Does purposely linking to an invalid URL and then using 301 affect SEO?

    - by Mike
    On a section of my site, I am currently using .htaccess rewrites to put the ID as part of the URL instead of in the query string, like so:

        RewriteRule ^([a-z_]+)?/?tours/([0-9]+)/(.*) /tours/tour_text.php?lang=$1&id=$2&urlstr=$3 [L]

    For example, if someone goes to /en/tours/12/some-text-here it will rewrite it to /tours/tour_text.php?lang=en&id=12&urlstr=some-text-here. However, I don't want users to be able to put in just any text, so if they type the wrong some-text-here part, it 301-redirects them to the right page. This works perfectly, but I can see a potential problem arising when localizing the website, so I just wanted to make sure it's not actually a problem. As it is now, if someone goes to /en/tours/12/some-text-here, the anchor to the Spanish version of that page will be /es/tours/12/some-text-here (i.e. only changing the "en" to "es"), and then the script will 301 them to the correct Spanish text (something like /es/tours/12/algun-texto-aqui). The reverse is also the same: the anchor on the Spanish version to the English version would be /en/tours/12/algun-texto-aqui, and the visitor will be forwarded with a 301 back to /en/tours/12/some-text-here. Basically, the anchor changes the language and the 301 changes the string at the end. So I have two questions: Does purposely and permanently having invalid URLs on your site that get 301'ed to the correct ones have any effect on SEO? I could make it show the correct URL to begin with, but this is a significant amount of work due to how I am handling the translations, so I would prefer just to 301 them. And will the invalid URLs that are contained in the links be added to the search engine indexes even if they get 301'ed to another page?

    Read the article

  • Are programming languages perfect?

    - by mohabitar
    I'm not sure if I'm being naive, as I'm still a student, but a curious question came to my mind. In another thread here, a user stated that in order to protect against piracy of your software, you must have perfect software. So is it possible to have perfect software? This is an extremely silly hypothetical situation, but if you were to gather the most talented and gifted programmers in the world and have them spend years trying to create 'perfect' software, could they be successful? Could they manage to leave not a single exploitable bug? Or are there flaws in programming languages that can still, no matter how hard you try, cause bugs that allow your program to be hijacked? As you can tell, I know nothing about security, but essentially what I'm asking is: is the reason software is easily exploitable the fact that imperfect human beings create it, or that imperfect programming languages are being used?

    Read the article

  • Why is this bypassing the sudo password?

    - by John Isaacks
    I have a bash script I am using to automate an SVN checkout. The contents of the file were:

        #!/bin/bash
        cd /var/www-cake
        sudo svn checkout file:///usr/local/svn/bash_repo/repo/

    When I double-click the file, it asks me what to do; I click the "Run In Terminal" button, a terminal pops up and asks me for the sudo password. I enter it, the script executes and the terminal closes. I wanted to give some sort of indication that the script ran successfully, so I edited my file to look like:

        #!/bin/bash
        cd /var/www-cake
        sudo svn checkout file:///usr/local/svn/bash_repo/repo/
        echo "Head revision has been pushed to live server"

    I expected the terminal to now stay open and show me the message afterwards. To my surprise, it now opens and immediately closes. The script does execute, and I no longer have to put in the sudo password. Is this right? I do not understand why this is happening; it seems like a security issue.

    Read the article

  • How to learn PHP effectively?

    - by Goma
    There are dozens of bad tutorials out there that teach you bad habits, especially where PHP is concerned. I want to learn how to avoid the things that can lead me to develop inefficient web applications. I like to learn from videos, but most videos I've found on the internet are made by people who do not follow good practices. My second option is to learn from books, but I have not found a good book for starters in PHP! It would be very helpful if you could tell me about your own story of learning PHP. What are the things I should avoid? How do I learn about PHP security from the beginning, so I don't have to unlearn things later on? Please provide links to books and websites that offer high-quality video tutorials for PHP, and your tips for a good start!

    Read the article

  • Where did I miss boot.properties?

    - by Dyade, Shailesh M
    Today one of my customers was trying to start a WebLogic Server (production instance). Though he was trying to start the server in the standard way, it was failing due to the error below:

        ####<Oct 22, 2012 12:14:43 PM BST> <Warning> <Security> <BanifB1> <> <main> <> <> <> <1350904483998> <BEA-090066> <Problem handling boot identity. The following exception was generated: weblogic.security.internal.encryption.EncryptionServiceException: weblogic.security.internal.encryption.EncryptionServiceException: [Security:090219]Error decrypting Secret Key java.security.ProviderException: setSeed() failed>

    And it then failed with the cause below:

        ####<Oct 22, 2012 12:16:45 PM BST> <Critical> <WebLogicServer> <BanifB1> <AdminServer> <main> <<WLS Kernel>> <> <> <1350904605837> <BEA-000386> <Server subsystem failed. Reason: java.lang.AssertionError: java.lang.reflect.InvocationTargetException
        java.lang.AssertionError: java.lang.reflect.InvocationTargetException
        weblogic.security.internal.encryption.EncryptionServiceException: weblogic.security.internal.encryption.EncryptionServiceException: [Security:090219]Error decrypting Secret Key java.security.ProviderException: setSeed() failed
        weblogic.security.internal.encryption.EncryptionServiceException: [Security:090219]Error decrypting Secret Key java.security.ProviderException: setSeed() failed
            at weblogic.security.internal.encryption.JSafeSecretKeyEncryptor.decryptSecretKey(JSafeSecretKeyEncryptor.java:121)

    The customer was facing this issue without any changes to the system; it had been stable and suddenly started showing this issue last night. When we checked, the customer was manually entering the username and password, and config.xml had the entries encrypted. However, when we verified, the customer had boot.properties in the Servers/AdminServer/security folder, while DomainName/security didn't have this file. Adding boot.properties fixed the issue.

    Regards,
    Shailesh Dyade

    Read the article

  • url mod_rewrite

    - by Pritam Borkar
    I had an e-commerce website hosted at http://mydomain.com/beta for more than a year; eventually I decided to move the website to the root, http://mydomain.com. I had posted quite a lot of links to forums etc. while my site was hosted in the /beta sub-directory. Is there any way to do a mod_rewrite so that all the old links I posted do not come back as broken links, now that the site is no longer hosted in /beta and lives at the site root? I did read that mod_rewrite can help resolve this issue, but also that it has to be done with care. Just a tip: this site is using friendly URLs.
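
    A minimal sketch (assuming the old /beta paths map one-to-one onto the same paths at the new root):

        RewriteEngine On
        # Permanently redirect any /beta/... request to the same path
        # at the root, so old forum links and search engines follow along.
        RewriteRule ^beta/(.*)$ /$1 [R=301,L]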

    Read the article

  • How to fix HTTP 400-499 error codes with 301 redirects in an .htaccess file

    - by user2131844
    Google previously indexed my website's pages (sitemap.xml) in the format below:

        www.domain.com/2013/04/18/hottest-gadgets-of-2013-to-include-in-your-list
        www.domain.com/2013/02/09/ringdroid

    I have resubmitted the sitemap, but there are still 404 errors in the Google/Bing engines. Could you please help me write a 301 redirect rule in the .htaccess file, so that when someone clicks the URL

        www.domain.com/2013/02/09/ringdroid

    they are redirected to:

        www.domain.com/ringdroid

    How can we write a rule in the .htaccess file to remove the date part, 2013/02/09/?
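
    A sketch of one way to strip a leading date (assuming every dated URL follows the yyyy/mm/dd pattern and the remaining slug matches the new page name):

        RewriteEngine On
        # Drop a leading yyyy/mm/dd/ segment and 301 to the bare slug.
        RewriteRule ^[0-9]{4}/[0-9]{2}/[0-9]{2}/(.+)$ /$1 [R=301,L]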

    Read the article

  • 301 redirects mirrored domain

    - by Dave
    I'm redesigning a site for a friend on my localhost. His old site is .asp based, and we're replacing it with a WordPress site on LAMP hosting. The old site sits on domain A and also has another domain, domain B, parked on top of it, mirroring it. Google has picked up domain B for most of his search engine results, while Yahoo and Bing etc. have picked up domain A. The plan is to 301-redirect the old pages of his site on domain A to the new WordPress versions and park domain B on top of it like before. My question is: will this work? If not, what would be a better way to approach it? We'd prefer not to lose any of the search engine listings in the redesign, and the search engines don't appear to have penalized him for duplicate content. Thanks very much in advance!
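
    For reference, a sketch of the two pieces this plan usually involves (the page names and hostnames below are made-up placeholders; picking one canonical host is optional, but it stops the two domains from mirroring each other):

        RewriteEngine On
        # Map an old .asp page to its new WordPress permalink.
        RewriteRule ^about\.asp$ /about/ [R=301,L]
        # Optionally settle on domain A as the canonical host.
        RewriteCond %{HTTP_HOST} !^domain-a\.com$ [NC]
        RewriteRule ^(.*)$ http://domain-a.com/$1 [R=301,L]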

    Read the article

  • Rewriting a URL for tomcat through an ajp connection

    - by StudentKen
    I've made several attempts to resolve this, but all have come up naught. Currently I have Apache set up to forward all URLs at and past the /portal/ tag to Tomcat. Unfortunately, Tomcat receives these requests under /portal/appName, a subdirectory of webapps, rather than under the webapps root directory where my WARs are deployed. Is there a simple solution to this that I'm not seeing? I've been trying to use mod_rewrite to rewrite ^/portal/ to /, but that doesn't yield the expected results (perhaps I'm doing this wrong?).
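
    If the goal is for Tomcat to see these requests at its ROOT context, one approach (a sketch; the AJP worker address is an assumption, and ProxyPass belongs in the virtual host configuration, not in .htaccess) is to strip the prefix in the proxy mapping itself rather than with mod_rewrite:

        # Forward /portal/... to the root of the AJP worker, dropping the prefix.
        ProxyPass        /portal/ ajp://localhost:8009/
        ProxyPassReverse /portal/ ajp://localhost:8009/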

    Read the article

  • How to Remove Extensions From URLs and Force a Trailing Slash at the End?

    - by Kronbernkzion
    Example of current file structure:

        example.com/foo.php
        example.com/bar.html
        example.com/directory/
        example.com/directory/foo.php
        example.com/directory/bar.html
        example.com/cgi-bin/directory/foo.cgi

    I would like to remove the HTML, PHP and CGI extensions from URLs, and then force a trailing slash at the end, so it could look like this:

        example.com/foo/
        example.com/bar/
        example.com/directory/
        example.com/directory/foo/
        example.com/directory/bar/
        example.com/cgi-bin/directory/foo/

    I am very frustrated because I've searched for 17 hours straight for a solution and visited more than a few hundred pages on various blogs and forums. I'm not joking. So I think I've done my research. Here is the code that sits in my .htaccess file right now:

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}\.html -f
        RewriteRule ^(([^/]+/)*[^./]+)/$ $1.html

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_URI} !(\.[a-zA-Z0-9]|/)$
        RewriteRule (.*)$ /$1/ [R=301,L]

    As you can see, this code only removes .html (and I'm not very happy with it, because I think it could be done a lot more simply). I can remove the extension from PHP files if I rename them to .html through .htaccess, but that's not what I want: I want to remove it directly. That is the first thing I don't know how to do. The second thing is actually very annoying. My .htaccess file with the code above adds .html/ to every string entered after example.com/directory/foo/. So if I enter example.com/directory/foo/bar (obviously /bar doesn't exist, since foo is a file), instead of just displaying a message that the page is not found, it converts the URL to example.com/directory/foo/bar.html/, then searches for a file for a few seconds and then displays the not-found message. This, of course, is bad behavior. So, once again, I need the code in .htaccess to do the following things (a sketch of one possible ruleset follows below):

        - Remove the .html extension
        - Remove the .php extension
        - Remove the .cgi extension
        - Force the trailing slash at the end of URLs
        - Behave correctly on bad requests (no adding trailing slashes or extensions to strings if the file or directory doesn't exist on the server)
        - Be as simple as possible

    I would very much appreciate any help. And to the first person that gives me the solution, I'll send two $50 iTunes Store gift cards for the US store. If this offends anyone, I am truly sorry and I apologize. Thanks in advance.
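
    Here is one hedged sketch of a combined ruleset (assumptions: the files live under the document root, so a ScriptAlias'ed cgi-bin outside it would need separate handling, and the host allows these directives in .htaccess):

        RewriteEngine On

        # 1) Externally redirect direct client requests for *.html/*.php/*.cgi
        #    to the clean, slash-terminated URL. Matching THE_REQUEST (the raw
        #    request line) keeps the internal rewrites below from looping back here.
        RewriteCond %{THE_REQUEST} \s/+(.+?)\.(html|php|cgi)[\s?] [NC]
        RewriteRule ^ /%1/ [R=301,L]

        # 2) Add a missing trailing slash to extensionless requests.
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^([^.]+[^/])$ /$1/ [R=301,L]

        # 3) Internally map the clean URL onto whichever file actually exists;
        #    if none exists, nothing matches and Apache serves a plain 404.
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{DOCUMENT_ROOT}/$1.html -f
        RewriteRule ^(.+)/$ $1.html [L]

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{DOCUMENT_ROOT}/$1.php -f
        RewriteRule ^(.+)/$ $1.php [L]

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{DOCUMENT_ROOT}/$1.cgi -f
        RewriteRule ^(.+)/$ $1.cgi [L]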

    Read the article

  • How can I test for a URL's existence before redirecting to it?

    - by ckliborn
    I am using Apache's mod_rewrite to redirect mobile users to my mobile site based on their HTTP_USER_AGENT. However, not all pages have a mobile equivalent. Also, mobile pages end in .html and "full" pages end in .shtml. Here is some pseudo-code (a sketch of a possible ruleset follows below):

        Does the user have a certain HTTP_USER_AGENT?
        Is there a mobile page? If so, take them there.
        If not, no redirection is needed.

    I want to do this with Apache.
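
    A sketch of how this is often done with a file-existence test (the /mobile directory layout and the user-agent pattern are assumptions; 302 is used so the redirect isn't cached as permanent):

        RewriteEngine On
        # Only consider clients whose user agent looks mobile.
        RewriteCond %{HTTP_USER_AGENT} (android|iphone|ipod|mobile) [NC]
        # Only redirect when the mobile twin of the page actually exists.
        RewriteCond %{DOCUMENT_ROOT}/mobile/$1.html -f
        RewriteRule ^(.+)\.shtml$ /mobile/$1.html [R=302,L]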

    Read the article

  • Open-source package for securely allowing users to log in and provide information

    - by JTS
    I have a site written mostly in PHP and HTML, and a SQL database of personal information like names and addresses. I would like my users to be able to log in to my website with a login I can email or snail-mail to them, and then view and edit their information in my database. Users can currently enter information online, and I store it in my database, but they can't view or edit stored information. I can add the code to do this, but when I give users the ability to view information, I suddenly have a lot more security concerns. Is there an open-source package that handles allowing users to do something like this? Or is there an established convention for it? I know this is a pretty basic question, and there might be some good literature about it that I have yet to find, so if someone can just point me in the direction of that information, or better yet give me some firsthand information about this, that would be great.

    Read the article

  • Go up one directory in mod_rewrite

    - by Rudolph Gottesheim
    I've got a standard Zend Framework 1 project that looks a bit like this:

        Project
        |- public
           |- .htaccess
           |- index.php

    The .htaccess looks like this:

        RewriteEngine On
        RewriteBase /
        RewriteRule ^image/.*$ img.php?file=$1 [NC,L]
        RewriteCond %{REQUEST_FILENAME} -s [OR]
        RewriteCond %{REQUEST_FILENAME} -l [OR]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^.*$ - [NC,L]
        RewriteRule ^.*$ index.php [NC,L]

    Now I want to start transitioning the site to Zend Framework 2, which I put in a separate directory in the root, so the whole thing looks like this:

        Project
        |- public
        |  |- .htaccess
        |  |- index.php
        |- zf2
           |- public
              |- .htaccess
              |- index.php

    What would I have to change in my original (ZF1) .htaccess to route all requests to (for example) /zf2/whatever to ZF2's index.php? I've tried

        RewriteRule ^zf2(/.*)$ ../zf2/public/index.php [NC,L]

    in the line after RewriteBase /, but that just gives me a 400 Bad Request.
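
    For what it's worth: a per-directory rewrite can't target a filesystem path above its own document root, which is why the ../ substitution ends in a 400 Bad Request. If you can touch the main server config, one alternative (a sketch; /path/to is a placeholder) is to expose zf2/public in the URL space with an Alias and let ZF2's own .htaccess take over from there:

        Alias /zf2 /path/to/Project/zf2/public
        <Directory /path/to/Project/zf2/public>
            AllowOverride All
            Require all granted
        </Directory>

    On Apache 2.2 the last directive would be "Order allow,deny" plus "Allow from all" instead of Require. Without config access, a symlink from public/zf2 to ../zf2/public can achieve much the same, provided FollowSymLinks is allowed.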

    Read the article

  • How can I redirect everything but the index as 410?

    - by Mikko Saari
    Our site shut down, and we need to serve a 410 to users. We have a small one-page replacement site set up on the same domain and a custom 410 error page. We'd like all page views to be answered with a 410 and sent to the error page, except for the front page, which should serve the new index.html. Here's what's in the .htaccess:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule !^index\.html$ index.html [L,R=410]

    This works, except for one thing: if I type just the domain name, I get the 410 page. With www.example.com/index.html I see the index page as I should, but plain www.example.com gets the 410. How could I fix this?
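
    A sketch of a likely explanation and fix: for a request to the bare domain, the per-directory rule is tested against an empty path ("" rather than "index.html"), which fails to match ^index\.html$ and therefore triggers the 410. Making the filename optional in the exception lets the root request through:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        # Allow both "" (the bare domain) and index.html; 410 everything else.
        RewriteRule !^(index\.html)?$ - [L,R=410]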

    Read the article

  • Switch to https

    - by Mike
    I'm looking to use an .htaccess file with mod_rewrite to switch the protocol from http:// to https:// when someone hits my website. For instance, once someone goes to http://www.mywebsite.com/, I'd like the browser to switch to https://www.mywebsite.com/. The same goes for http://mywebsite.com/ to https://mywebsite.com/. This is the code I've been using, and I've experienced some odd things with it, so if anyone could tell me whether this is the right way to do it, or has a better way, please share it. Thanks in advance.

        RewriteEngine On
        RewriteCond %{SERVER_PORT} !=443
        RewriteRule ^(.*)$ https://www.ebaillv.com/$1 [R=301,L]
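
    One thing that stands out: the rule hard-codes a single hostname, so every request, whatever host it arrived on, is switched to that one host as a side effect. If the goal is simply to upgrade the scheme for whichever host the visitor used, a host-preserving sketch (using %{HTTPS}, which also behaves when TLS isn't on port 443):

        RewriteEngine On
        RewriteCond %{HTTPS} off
        # Upgrade the scheme but keep the host the visitor asked for.
        RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]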

    Read the article

  • Is there any good reason I would want my website to be framed?

    - by minitech
    I'm building a website that's not security-critical in any way at all, so having somebody put a page in an <iframe> is not particularly dangerous to its users. However, as my website doesn't have script plugins that will be used anywhere else, is there any reason why I shouldn't just apply: X-Frame-Options: Deny to every page on my website? Is there any valid reason for any other website to embed mine? I've seen plenty of content-stealing ones and attempts to hijack user accounts, but never an actual good usage of frames that's not an explicit feature of the website.
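
    For reference, a sketch of how that header is commonly set site-wide in Apache (assuming mod_headers is available; other servers have equivalent directives):

        <IfModule mod_headers.c>
            # Refuse framing everywhere, on every response.
            Header always set X-Frame-Options "DENY"
        </IfModule>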

    Read the article

  • help redirecting IP address

    - by Alice
    Google has indexed the IP address of my site rather than the domain, so now I'm trying to set up a 301 redirect that sends the IP address, and all pages under it, to the domain. I currently have something like this in my .htaccess file (however, I don't think it's working correctly):

        RewriteCond %{HTTP_HOST} ^12.34.567.890
        RewriteRule (.*) (domain address)/$1 [R=301,L]

    I've used various redirect-checker tools and keep getting the message: "... not redirecting to any URL or the redirect is NOT SEARCH ENGINE FRIENDLY". Am I doing something wrong, or is there something else I should be trying? Thanks! Alice
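
    A sketch of the usual shape of such a rule (the IP and domain below are placeholders carried over from the question; the dots in the host test are escaped, and the target is a full absolute URL, scheme included):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^12\.34\.567\.890$
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]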

    Read the article

  • Getting the masked URL values in MediaWiki

    - by Kalai
    I have successfully masked the URL in MediaWiki by using the following scripts in the .htaccess and LocalSettings.php files:

    .htaccess:

        Options +FollowSymLinks
        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)/(.*)$ /mediawiki/index.php?title=$1&actions=$2 [L]

    LocalSettings.php:

        $wgScriptPath = "/lib/mediawiki";
        $wgArticlePath = "/lib/mediawiki/$1/$2";

    It is working fine with the required URL. But my problem is that I want to treat the second parameter as a query string for my pages, and I cannot get it in my file. I tried the $wgRequest function, but it only gives me the first parameter, title. I also tried $_REQUEST; it sometimes gives the value of $_REQUEST['actions'], but many times it does not. I can't understand what the problem is.
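
    One mod_rewrite detail worth checking (a sketch, not verified against this wiki): without the QSA flag, a query string in the substitution replaces whatever query string the request already carried, which can make a parameter appear or vanish depending on the incoming URL. Appending QSA merges the two:

        RewriteRule ^(.*)/(.*)$ /mediawiki/index.php?title=$1&actions=$2 [QSA,L]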

    Read the article

  • RewriteRule working local but not on remote server

    - by m0tv
    I have an .htaccess file with one simple RewriteRule:

        RewriteEngine on
        RewriteRule ^([A-Za-z0-9-]+)$ ?site=$1

    I want a URL like http://www.example.com/imprint to be forwarded to http://www.example.com/?site=imprint. I checked this rule with a RewriteRule tester, which gave me the results I want to achieve, and on my local development system it works well too. But on a remote server the URLs just give me a 404 error. Other, simpler rewrite rules work with no problems, so everything must be set up correctly (I think...). The problem is that I don't have access to any error logs or the server configs, so the only thing I can do is guess. Can anyone tell me if there's something wrong with this rule? Or anything else I can do or test to solve this? Or has someone an idea of what could be wrong on the server?
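
    Two server-side differences that commonly produce exactly this local-versus-remote behavior (guesses, since the configs aren't visible here): the host may have MultiViews enabled, which lets mod_negotiation grab extensionless URLs before mod_rewrite sees them, or AllowOverride may be set so that parts of the .htaccess are ignored. The first is at least testable from the .htaccess itself, if the host permits it:

        # Worth trying at the top of the .htaccess:
        Options -MultiViews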

    Read the article

  • Infinite redirect loop in cpanel-purchased script

    - by lital maatuk
    I am installing a script (that I bought on cPanel) in the root directory of my web site. When I try to install it, it gives me an error: it starts an infinite URL redirect loop containing the name of my web site, something like

        install//mywebsite.com/install=mywebsite.com/install=mywebsite.com/install=mywebsite.com

    and so on, until the browser refuses to continue because the URL gets too long. The vendor told me I need to have mod_rewrite enabled on my cPanel host, and something about .htaccess. How do I fix this?

    Read the article

  • Using mod_speling with multi-level htaccess and rewriterules

    - by michaelcgorman
    We recently switched formats for managing our 301s. For the most part everything went well, but the change seems to have stopped mod_speling from working properly. Here's what we changed. The old /var/www/html/.htaccess:

        RewriteEngine on
        RewriteBase /
        # Change SHTML to HTML
        RewriteRule ^(.*)\.shtml$ $1.html [R=permanent,L]
        # Change PCF to HTML ('cause, you know, we probably have CMS users like that...)
        RewriteRule ^(.*)\.pcf$ $1.html [R=permanent,L]
        # Force WWW subdomain for all requests
        RewriteCond %{HTTP_HOST} !^www.example.edu$ [NC]
        RewriteRule ^(.*)$ http://www.example.edu/$1 [R,L]
        # User accounts are on sun.example.edu
        RedirectMatch ^/~(.*)$ http://sun.example.edu/~$1
        # Remove index.html at the end of URLs
        RewriteCond %{REQUEST_URI} ^(.*/)index\.html$ [NC]
        RewriteRule . %1 [R=301,NE,L]
        Redirect 301 /academics/calendar2012-13.html http://www.example.edu/academics/calendar.html
        Redirect 301 /academics/departments/ http://www.example.edu/majors/
        Redirect 301 /academics/Pre-Medical.pdf http://www.example.edu/academics/Pre-Medicine.pdf
        Redirect 301 ...

    The new /var/www/html/.htaccess:

        RewriteEngine on
        RewriteBase /
        # Change SHTML to HTML
        RewriteRule ^(.*)\.shtml$ $1.html [R=permanent,L]
        # Change PCF to HTML ('cause, you know, we probably have CMS users like that...)
        RewriteRule ^(.*)\.pcf$ $1.html [R=permanent,L]
        # Force WWW subdomain for all requests
        RewriteCond %{HTTP_HOST} !^www.example.edu$ [NC]
        RewriteRule ^(.*)$ http://www.example.edu/$1 [R,L]
        # User accounts are on sun.example.edu
        RedirectMatch ^/~(.*)$ http://sun.example.edu/~$1
        # Remove index.html at the end of URLs
        RewriteCond %{REQUEST_URI} ^(.*/)index\.html$ [NC]
        RewriteRule . %1 [R=301,NE,L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*) 404/$1

    And then we added a new file at /var/www/html/404/.htaccess:

        RewriteEngine on
        RewriteBase /404
        RewriteRule ^academics/calendar2012-13.html$ /academics/calendar.html [R=302,L]
        RewriteRule ^academics/departments/$ /majors/ [R=301,L]
        RewriteRule ^academics/Pre-Medical.pdf$ /academics/Pre-Medicine.pdf [R=301,L]
        RewriteRule ...

    I do have (Webmin-based) access to the httpd.conf (though we don't want to store all our 301s there, if possible). We're running Apache 2.2.15 on RHEL 6 on a server in our own data center. Like I said, the only problem we're seeing is that mod_speling isn't doing its magic anymore. The new format has so many advantages over the old that we really don't want to go back, but mod_speling is so nice to have that we'd also really like it to work if possible. Any ideas for how we might be able to fix mod_speling?
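
    A possible explanation (hedged; this is inferred from how the modules interact, not tested against this server): mod_speling only corrects a request that fails to map to an existing file, but the new !-f/!-d catch-all rewrites every such request into 404/... before mod_speling gets a chance, so misspelled URLs are now swallowed by the rewrite instead of being corrected. Confirming the module is actually active is a cheap first step:

        <IfModule mod_speling.c>
            CheckSpelling On
        </IfModule>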

    Read the article

< Previous Page | 69 70 71 72 73 74 75 76 77 78 79 80  | Next Page >