Search Results

Search found 9717 results on 389 pages for 'pro'.

Page 220 of 389

  • Can't get Rewrite rule to keep original URL

    - by user38100
    I have these rewrites, but I would like the URL to stay the same as what was originally typed. I thought removing the [R] flags would stop the redirect, but it hasn't:

      RewriteCond %{HTTP_HOST} ^examplea\.example\.com$ [NC]
      RewriteRule (.*) http://examplea.example.com:32400/web [L]
      RewriteCond %{HTTP_HOST} ^exampleb\.example\.com$ [NC]
      RewriteRule (.*) http://exampleb.example.com:9091 [L]

    Edit: would this work better?

      RewriteCond %{HTTP_HOST} ^hello.example.com$
      RewriteRule ^(/)?$ welcome [L]
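
    A point worth noting: when the substitution is an absolute URL on a different port, mod_rewrite normally treats it as an external redirect even without [R], so the address bar changes anyway. Keeping the typed URL usually means proxying instead. A minimal sketch, assuming mod_proxy and mod_proxy_http are enabled and the backend really listens on port 32400:

      RewriteCond %{HTTP_HOST} ^examplea\.example\.com$ [NC]
      # [P] makes Apache fetch the backend response itself instead of redirecting
      RewriteRule ^(.*)$ http://localhost:32400/web/$1 [P,L]

    With [P], the visitor keeps seeing examplea.example.com while Apache talks to the backend port internally.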

    Read the article

  • URL slugs: ideal length, and the real SEO effects of these slugs

    - by tattvamasi
    This question is addressed widely on SO and outside it, but for some reason, instead of taking it all as a good load of great advice, the information is confusing me.

    Problem: I already had "prettified" URLs on one of my sites. I had taken out the query strings and rewritten the URLs, and the links were short enough for me, but there was a problem: the ID of the item or post in the URL isn't good for users. One of the users asked if there is a way to get rid of the numbers, and I thought it would be better for users to see a clue about the page content in the URL.

    Solution: With this in mind, I am trying it out on a section of the site. Armed with 301 redirects, some parsing work, and a lot of patience, I have added URL slugs to some blog entries, and the slug reports the title of the article (something close to http://example.com/my-news/terribly-boring-and-long-url-that-replaces-the-number-I-liked-so-much/).

    Problems after the solution: As I see it, the URL of those blog articles is now certainly very descriptive, but it is also impossible to remember. So this brings me back to the issue I had before: if numbers say nothing and can't be remembered, what's the use of these slugs? I prefer to see http://example.com/my-news/1/ rather than http://example.com/my-news/terribly-boring-and-long-url-that-replaces-the-number-I-liked-so-much/. To avoid forcing my users to memorize my URLs, I have added a script that finds the closest match to the URL you type and redirects there. This is something I like, because the page now acts as a sort of little search engine, and users can play with the URLs to find articles.

    Open questions: I still have some open questions and don't seem to be able to find answers, because the answers tend to contradict one another. 1) How many characters should a URL ideally be? I've read the magic number 115 and am sticking to that, but am not sure. 2) Is this really good for SEO? One of those blog articles I have redirected, with the ID number in the URL and all, ranked second on Google. I've just found a question whose answer seems to be consistent with what I think (URL slug and SEO - structure), but see this other question with the opposite opinion. 3) To ask with a specific example: would this URL risk being penalized? Is it acceptable? Is it too long? StackOverflow seems to have comparably long URLs, but I'm not sure it's a winning strategy in my case. I just want to help my users without running into Google's algorithms.

    Read the article

  • How to Do htaccess 301 Redirect from Old Filename Pattern to New Filename Pattern?

    - by user249493
    I have a bunch of old files prefixed with "old-" (e.g. "old-abcde.php"). I need an htaccess rule to set up a 301 redirect so that any request for a file starting with "old-" goes to its corresponding new version (e.g. "abcde.php"). To be clear, I have many files, not just one, so I can't do a literal filename match. I basically just need to strip the "old-" off the request and redirect to the version without it. I know I probably just need a simple regular expression, but I'm not good at writing them. Can anyone provide assistance?
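
    For reference, a rule along these lines should do it; this is only a sketch, assuming the old and new files live in the document root next to the .htaccess file:

      RewriteEngine On
      # capture everything after the "old-" prefix and 301 to the bare name
      RewriteRule ^old-(.+)$ /$1 [R=301,L]

    The (.+) captures the rest of the filename and $1 reuses it in the redirect target.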

    Read the article

  • Re-Route Mail to a port other than 25

    - by Ken
    Is there a way to route mail to another port? I have an email account attached to my laptop that I'd like to be able to send and receive mail from. Because I am mobile, I'll be passing through various networks that will probably block this port. My dynamic DNS provider allows me to use web forwards for MX domains; is it possible to forward to a domain:port managed by my DNS provider as I move between networks? If not, is there another way? Of course I could use webmail or relay forwarding from my home server, but that's not geeky enough.

    Read the article

  • How to make the search engine show my location map? [duplicate]

    - by Lena Queen
    This question already has an answer here: What are the most important things I need to do to encourage Google Sitelinks? (5 answers)

    I have a webpage that is already listed on Google Maps. If I search for the term "my web", the search engine (Google) only shows my website. How can I make Google show my website and the Google Maps location on the right, as in this scenario: for the contact and portfolio pages, how can I make it so users can view some of my pages beside my home page? Also, how can I make Google show the other links of my site?

    Read the article

  • Massive 404 attack with non-existent URLs. How can I prevent this?

    - by tattvamasi
    The problem is a whole load of 404 errors, as reported by Google Webmaster Tools, for pages and queries that have never been there. One of them is viewtopic.php, and I've also noticed a scary number of attempts to check whether the site is a WordPress site (wp_admin) and for the cPanel login. I already block TRACE, and the server is equipped with some defense against scanning/hacking. However, this doesn't seem to stop it. The referrer is, according to Google Webmaster Tools, totally.me. I have looked for a solution to stop this, because it certainly isn't good for the poor real users, let alone the SEO concerns. I am using the Perishable Press mini blacklist (found here), a standard referrer blocker (for porn, herbal and casino sites), and even some software to protect the site (XSS blocking, SQL injection, etc.). The server is using other measures as well, so one would assume that the site is safe (hopefully), but it isn't ending. Does anybody else have the same problem, or am I the only one seeing this? Is it what I think, i.e. some sort of attack? Is there a way to fix it, or better, prevent this useless waste of resources?

    EDIT: I've never used the question itself to thank people for the answers, and I hope this can be done. Thank you all for your insightful replies, which helped me find my way out of this. I have followed everyone's suggestions and implemented the following: a honeypot; a script that listens for suspect URLs on the 404 page and sends me an email with the user agent/IP while returning a standard 404 header; and a script that rewards legitimate users, on the same custom 404 page, in case they end up clicking one of those URLs. In less than 24 hours I have been able to isolate some suspect IPs, all listed in Spamhaus. All the IPs logged so far belong to spam VPS hosting companies. Thank you all again; I would have accepted all answers if I could.
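
    The 404 listener mentioned in the edit can be quite small. A minimal sketch of the idea in PHP, assuming it is used as the custom 404 page; the $suspect patterns, log path and notification address are placeholders:

      <?php
      // custom 404 handler: log obviously hostile requests, always answer 404
      $uri   = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '';
      $ip    = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '';
      $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

      // patterns typical of bot probes (placeholders, extend as needed)
      $suspect = array('wp-admin', 'wp-login', 'viewtopic.php', 'cpanel');

      foreach ($suspect as $needle) {
          if (stripos($uri, $needle) !== false) {
              $line = date('c') . " $ip $agent $uri\n";
              file_put_contents('/var/log/suspect-404.log', $line, FILE_APPEND);
              // or: mail('admin@example.com', 'Suspect 404', $line);
              break;
          }
      }

      header('HTTP/1.1 404 Not Found');
      echo 'Page not found.';

    Legitimate visitors still get an ordinary 404, while the probes end up in a log that can be fed to Spamhaus lookups or firewall rules.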

    Read the article

  • Is the W3C standard a major factor when Google decides SERP position?

    - by Anonymous12345
    I have a dynamic PHP website whose index page alone has around 800 errors according to the online W3C validator. I have tried checking major websites like eBay, Stack Overflow and others as well, all with around 400 errors. So my first thought is: what good is the validator when it always displays errors? Secondly, will the errors affect my SERP ranking? I.e., will fixing these errors as well as I can improve my Google search position? Thanks

    Read the article

  • Will not supporting IE or older browsers drive away potential visitors/users of my site? [closed]

    - by XToro
    Normally I browse SO, but this question doesn't fit there; hopefully it fits here. I just want to ask, from a web designer's point of view, whether it's wrong not to care about supporting Internet Explorer or older browsers. The site I'm designing looks great in all browsers except IE9 and below. There are certain things IE doesn't support or doesn't handle like other browsers: WebKit stuff, some CSS styles, drag-and-drop of files from the OS, etc., but it all works great in Safari, Firefox, Chrome and so on. Should I be that concerned? I know there are several people who use IE, but its limitations have just been causing me more work by forcing me to come up with workarounds. From what I've read, many of the issues I've been having should be solved in IE10, but not everybody keeps up to date. I know of several people who are still using IE6! Again, I'm hoping this is the right place to ask a question like this, and if not, please point me to the right Stack Exchange site instead of just downvoting me. Thanks!

    EDIT: Upon further research, so far this year IE (all versions) and Chrome have been neck and neck at the top, with IE only just squeaking past Chrome, and Firefox a close third. But looking at the top 10 browsers, IE6 doesn't even show up on a list whose lowest entry has 1.92%. Source: http://www.w3counter.com/globalstats.php?year=2012&month=7. Having a look at another site, IE6 shows up in 11th place out of 12, just before "Other": http://www.sitepoint.com/browser-trends-february-2012/. This makes me a little more wary of not spending more time on IE compatibility. However, my site will not be going to a live beta until October or November, and I'm hoping that IE10 will have more features coded into it. Currently, my upload page, which is a drag-and-drop-files-from-the-OS affair, simply displays "IE is not supported", leaving IE users no other way to upload pictures, because I've spent so much time writing the uploader, which does many things besides just uploading the files. I will be changing this rather cold "Access Denied" into a suggestion to upgrade or to install another browser, with download links for each. Big thanks for the posts here and the interesting links!

    Read the article

  • Good message board for a website (e.g. phpBB)

    - by unixman83
    Hi, what are the best (and most widely used) Linux-based message board packages, and the pros and cons of each? E.g. security vulnerabilities, performance on a cheap server, whether it comes pre-packaged (RPM or DEB). I am looking for the best message board software for my website. A VPS can run almost any software, so the sky is the limit! My criteria:
      - Free, and doesn't require an unreasonable number of hyperlinks back to their website
      - Security focused / widely used, so vulnerabilities are found and fixed quickly
      - Easy to keep up to date, i.e. prepackaged / auto-updating in some way
      - Moderator features (like pinning / message preambles) and account management
      - Themeable, so I can customize the appearance a bit

    Read the article

  • Simple SQL GROUP BY with custom groups question [migrated]

    - by alex
    Imagine a MySQL table that has only 2 columns, an id and the name of a color. With this query I know how many ids I have for each color:

      SELECT color_name, COUNT(id) FROM color_table GROUP BY color_name;

      red:10
      blue:5
      yellow:3
      green:1

    My question is: is there a way I can specify custom groups to the GROUP BY? I mean, is there a query that results in this?

      red:10
      colors different than red: 9
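
    One common way to express this kind of bucketing is to group on a CASE expression. A sketch against the same hypothetical table:

      SELECT CASE WHEN color_name = 'red' THEN 'red'
                  ELSE 'colors different than red'
             END AS color_group,
             COUNT(id)
      FROM color_table
      GROUP BY color_group;

    MySQL allows grouping by the select-list alias; in stricter databases you would repeat the CASE expression in the GROUP BY clause.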

    Read the article

  • How do I map some subdirectories to run alongside a Drupal site?

    - by paradroid
    I have a Drupal site running on Apache using the following vhosts file:

      <VirtualHost xx.xx.xx.xx:80>
          ServerName bananas.net
          ServerAlias www.bananas.net
          DocumentRoot /var/www/drupal/
          RewriteEngine On
          RewriteCond %{HTTP_HOST} !=bananas.net [NC]
          RewriteRule ^(.*)$ http://bananas.net$1 [L,R=301]
          <Directory /var/www/bananas.net/>
              Options -Indexes FollowSymlinks
              AllowOverride All
              Order allow,deny
              Allow from all
          </Directory>
          CustomLog ${APACHE_LOG_DIR}/access.log combined
          ErrorLog ${APACHE_LOG_DIR}/error.log
      </VirtualHost>

    I set it up some time ago, so I am not sure what the <Directory /var/www/bananas.net/> directive was meant for. That directory is currently empty. With the vhosts file the way it is, does the Directory directive have any effect at all? I want to add some content which is separate from the Drupal site. How do I add sub-directories within /var/www/bananas.net/ which can be accessed alongside the Drupal site running at the root? As they have nothing to do with the Drupal site, I want to keep the files separate, but still use the same domain.
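
    One approach, sketched here under the assumption that the extra content stays outside the Drupal docroot (the /static path and directory name are made up for illustration), is to map it in with Alias inside the same <VirtualHost>:

      Alias /static /var/www/bananas.net/static
      <Directory /var/www/bananas.net/static>
          Options -Indexes
          AllowOverride None
          Order allow,deny
          Allow from all
      </Directory>

    With that in place, http://bananas.net/static/ is served from the separate directory while every other path still falls through to the Drupal DocumentRoot, and Drupal's own .htaccess rewrites under /var/www/drupal never see the aliased requests.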

    Read the article

  • Domain mapping issues

    - by Nadya
    I have two domain names - .com and .co.uk - bought with 123-reg, and just one student Windows hosting pack, associated with the .co.uk domain. The .com domain is the main one which people will be trying to access, so I mapped that domain to the hosting this morning. The problem is that I would really like it to be functional by tomorrow morning, and the usual waiting time is 24-48 hours. Is there any point in stopping the process and trying to forward it with a CNAME record instead; does that take less time? (I can just go back and do proper domain mapping over the weekend.) Also, is there a way to check whether the domain mapping has been done correctly before these 24-48 hours are up? From some computers I get a 404 error on the homepage.
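
    On the question of checking early: DNS changes can usually be verified before they reach your local resolver by querying the registrar's authoritative nameservers directly. A sketch with dig (the nameserver host below is a placeholder; use whichever servers 123-reg lists for the domain):

      dig @ns.123-reg.co.uk example.com A +short
      dig @ns.123-reg.co.uk www.example.com CNAME +short

    If the authoritative answer already shows the hosting IP (or the CNAME target), the mapping itself is correct and the remaining delay is just caches expiring elsewhere.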

    Read the article

  • Technical website - Should I assume that my visitors will use a modern browser?

    - by marco-fiset
    I am in the process of creating my own website, which will include a technical blog. I want to build it using modern technologies such as HTML5 and CSS3. Since my website will be targeted at programmers and mostly tech-savvy users, should I take it for granted that these people will be using a modern browser? Or should I make my site compatible with older browsers just in case? I don't want to go through the pain of adapting my website to be compatible with browsers I assume won't be used.

    Read the article

  • Bayesian content filter for vbulletin [on hold]

    - by mc0e
    I've been tasked with coming up with a tool to automatically flag some posts for moderator attention on a large vBulletin forum. It's not spam per se, but the task has a lot in common with the sort of handling that might be done by a spam-protection plugin (a "mod" in vBulletin speak). There's only so much I can say, but the task does not involve bad users so much as particular kinds of posts which the moderators need to be aware of. Filtering on user registrations and links is therefore not useful, and we are talking about posts by real human users. What I'm looking for is an existing Bayesian classification plugin, or something I can study to get an understanding of how to do the vBulletin side of the interface in order to build such a thing. I.e. I'd need ways for moderators to list flagged posts and to correct the classification of posts which have been mis-classified. Ideally I want a 3-way split with an "unsure" category, in order to reduce what has to be reviewed to find any mis-classifications. Any pointers? I've searched around a bit, and so far what I've found has been more or less entirely targeted at intervening in sign-ups (mostly using StopForumSpam), captchas, and the use of external services like Akismet, which are spam-specific. I'm also considering an external solution, which might be able to be interfaced with the forum.
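
    For reference on the classification half of the problem (none of this is a vBulletin API; the class names, data layout and threshold are made up for illustration), the core of a naive Bayes text classifier with a three-way flag/clean/unsure outcome is small:

      <?php
      // Minimal naive Bayes sketch. $wordCounts[word][class], $wordTotals[class]
      // and $docCounts[class] are assumed to be loaded from training data the
      // plugin maintains (e.g. its own database table).
      function classify(array $wordCounts, array $wordTotals, array $docCounts, $text)
      {
          $vocabSize = count($wordCounts);
          $totalDocs = array_sum($docCounts);
          preg_match_all('/\w+/u', mb_strtolower($text), $m);

          $score = array();
          foreach (array_keys($docCounts) as $class) {
              $score[$class] = log($docCounts[$class] / $totalDocs);   // class prior
              foreach ($m[0] as $word) {
                  $seen = isset($wordCounts[$word][$class]) ? $wordCounts[$word][$class] : 0;
                  // Laplace smoothing so unseen words don't zero everything out
                  $score[$class] += log(($seen + 1) / ($wordTotals[$class] + $vocabSize));
              }
          }

          // three-way outcome: a small margin between the classes means "unsure"
          if (abs($score['flag'] - $score['clean']) < 2.0) {   // threshold is arbitrary
              return 'unsure';
          }
          return $score['flag'] > $score['clean'] ? 'flag' : 'clean';
      }

    The other half - listing flagged posts and letting moderators reclassify them so the counts get corrected - is the part that has to live in the vBulletin plugin hooks around post saving.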

    Read the article

  • How to enable the user to add background images to anchor links through the WordPress admin panel? [closed]

    - by janoChen
    I have CSS selectors like this in my style.css:

      .jimgMenu ul li.landscapes a {
          background: url(../images/landscapes.jpg) repeat scroll 0%;
      }

    What's the easiest way to enable the user to add background images to anchor links like the ones below?

    front-page.php:

      <div class="jimgMenu">
          <ul>
              <li class="landscapes"><a href="#nogo">Landscapes</a></li>
              <li class="people"><a href="#nogo">People</a></li>
              <li class="nature"><a href="#nogo">Nature</a></li>
              <li class="abstract"><a href="#nogo">Abstract</a></li>
              <li class="urban"><a href="#nogo">Urban</a></li>
              <li class="people2"><a href="#nogo">People</a></li>
          </ul>
      </div>

    To illustrate:

      .jimgMenu ul li.landscapes a {
          background: url(<add background image>) repeat scroll 0%;
      }

    What would that code look like?
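
    One way this is commonly done (a sketch only; the 'landscapes_bg' setting name is hypothetical) is to store the image URL as a theme modification and print the rule from the theme, e.g. in functions.php:

      <?php
      // Print the user-configurable background rule in the page <head>.
      // get_theme_mod() reads a value saved through the WordPress Customizer;
      // 'landscapes_bg' is a made-up setting name for this sketch.
      function jimgmenu_dynamic_css() {
          $url = get_theme_mod('landscapes_bg');
          if ($url) {
              echo '<style>.jimgMenu ul li.landscapes a { background: url('
                  . esc_url($url) . ') repeat scroll 0%; }</style>';
          }
      }
      add_action('wp_head', 'jimgmenu_dynamic_css');

    Registering the setting itself (through the Customizer or a small options page with a media uploader) is what makes the image editable from the admin panel; a similar mod/rule pair would be needed for each menu item.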

    Read the article

  • Is the use of hashbang really a good idea? [on hold]

    - by user32642
    I've been working on a WordPress site lately that was designed with hashbangs (shebangs) in its dynamically generated URLs. After doing some research, I noticed that Google expresses some preferences about their use and about how it crawls such a site. However, after I ran several sitemap generators and Screaming Frog SEO Spider, I realized that the only page being crawled was the index page. So now I am questioning the use of hashbangs. What do you think? Should I attempt to remove them? Or does it even matter? And does anyone know of an easy way to remove them? The site is www.modernvintage1005.com

    Read the article

  • How to configure Google sitemap links in Wordpress? (without editing its HTML or PHP source code) [duplicate]

    - by Alexander Farber
    This question already has an answer here: What are the most important things I need to do to encourage Google Sitelinks? (5 answers)

    I run a WordPress 3.7.1–de_DE site, but don't have much experience with it yet. When my site comes up in a Google search, there are 2 links displayed underneath. I believe these links are called a "Google sitemap", and my question is how to configure them in WordPress. While the right link points to the /ueber-mich URL on the website, the left link was pointing to a non-existing /imprint, and I had to add that webpage as a workaround for now. And I'd like to change the /imprint to the German /impressum anyway (currently I use mod_rewrite to redirect). UPDATE: Dear downvoters and movers, would you mind READING my question please? My question is about how to configure Google sitemap links in WordPress. So it is NOT A DUPLICATE (I do not want to edit the HTML code, I want to find the correct configuration in WordPress), and my question SHOULDN'T HAVE BEEN MOVED AWAY from wordpress.stackexchange.com.

    Read the article

  • Do you know any independent keyword (phrase) statistics trend website?

    - by Sam
    Hi all, does anyone know an equally impressive service that shows the number of times a specific keyword (phrase) has been searched, as well as a branch of other similar words? The one discussed in this video (Wordtracker.com) seems very good, but has unfortunately gone commercial, which is not what I'm looking for. I would really prefer a free tool... http://www.youtube.com/watch?v=H2M1tXtAc18&feature=related Any suggestions for similar free online tools are very welcome. Thanks

    Read the article

  • Google Analytics: Custom variables issue difference in data

    - by Bart
    We've set up tracking through custom variables in Google Analytics to measure which offices are getting the most traffic. The custom variable consists of a key (= office) and a value (= office name). In the Custom Variables tab under Audience we get no data (actually we got 1 hit, but we think the data is way off). When we set up advanced segments with filters on the key and value, we get the correct data. Now we are wondering why we aren't getting that data in the Custom Variables tab.
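
    For comparison, the classic (ga.js) way of recording such a variable looks like the sketch below; one frequent cause of an empty Custom Variables report is a _setCustomVar call pushed after the pageview, so the variable never rides on any hit. The slot number, names and property ID here are illustrative:

      // ga.js: the custom variable must be set before the hit it attaches to
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXX-Y']);
      _gaq.push(['_setCustomVar',
                 1,            // slot 1-5
                 'office',     // key
                 'Vienna',     // value (illustrative)
                 3]);          // scope: 3 = page-level
      _gaq.push(['_trackPageview']);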

    Read the article

  • I got a MySQL error on this statement and I don't know why [closed]

    - by John Smiith
    I got a MySQL error on this statement and I don't know why. The error is:

      #1064 - You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'CONSTRAINT fk_objet_code FOREIGN KEY (objet_code) REFERENCES objet(code) ) ENG' at line 6

    The SQL code is:

      CREATE TABLE IF NOT EXISTS `class` (
        `numero` int(11) NOT NULL AUTO_INCREMENT,
        `type_class` varchar(100) DEFAULT NULL,
        `images` varchar(200) NOT NULL,
        PRIMARY KEY (`numero`)
        CONSTRAINT fk_objet_code FOREIGN KEY (objet_code) REFERENCES objet(code)
      ) ENGINE=InnoDB;;
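
    For what it's worth, the syntax error is the missing comma between the PRIMARY KEY clause and the CONSTRAINT clause, and the statement as written never defines the objet_code column that the foreign key refers to. A corrected sketch, assuming objet(code) exists and that int(11) is the right type for it:

      CREATE TABLE IF NOT EXISTS `class` (
        `numero` int(11) NOT NULL AUTO_INCREMENT,
        `type_class` varchar(100) DEFAULT NULL,
        `images` varchar(200) NOT NULL,
        `objet_code` int(11) NOT NULL,
        PRIMARY KEY (`numero`),
        CONSTRAINT fk_objet_code FOREIGN KEY (`objet_code`) REFERENCES objet(`code`)
      ) ENGINE=InnoDB;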

    Read the article

  • Are the contents in the front page considered as duplicate of the post?

    - by yibe
    I asked this same question on Stack Overflow, but it was closed as off-topic, so I am posting it here. In WordPress blogs, the front page will display many posts in whole or as excerpts. When the link to a post is clicked, the content is opened with another template file (single.php). Can we say that the content displayed on the front page and on the post pages is considered duplicate? Does it harm SEO in any way?

    Read the article

  • Advantages of country TLD vs. .com

    - by Tschareck
    I want to get a domain for my site. The site's topic is Vienna, but the content will be in English. I am wondering whether I should get the .com domain or the .at domain. .at is both much cheaper and easier to get (there is less chance that my desired phrase is already registered). Is there any disadvantage in terms of SEO and PageRank if my domain does not end in .com? The site will be in English and targeted not just at Austria but globally, mostly at foreign tourists. I don't care whether the address is easy to remember; I expect most traffic to come from search engines anyway.

    Read the article

  • How do I get the root index page to redirect to a subdirectory without affecting SEO?

    - by paradroid
    I am reviving/reorganising my personal WordPress blog. It uses a URL that looks like this: http://mydomain.com/blog. The web server 301-redirects www.mydomain.com to mydomain.com. I want to use the blog subdirectory because I plan to add other parts to the site, with the blog being only one part of it. However, at the moment there is nothing there but the blog, so I want the root index page to redirect to the blog for the time being. I have been using this on the root index.html page to do the redirect...

      <meta http-equiv="REFRESH" content="0;url=./blog"></HEAD>

    ...but this seems to have stopped the site being indexed by Google and Bing. How do I do this without affecting SEO? Also, what URL should I put in the sitemap.xml?
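
    A server-side 301 is generally a safer choice here than a meta refresh. A minimal .htaccess sketch for the document root, assuming mod_rewrite is available (it usually is where WordPress runs):

      RewriteEngine On
      # send only the bare root to /blog/, leave every other path alone
      RewriteRule ^$ /blog/ [R=301,L]

    Search engines treat a 301 as a permanent move, so the /blog/ URLs are what would normally go into sitemap.xml.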

    Read the article

  • Best Text-to-Speech Solution for my Website [on hold]

    - by Tim Marshall
    I'm working on the 'Ease of Access' section of my website, with options to increase the font size displayed on pages, invert colours and so on. I wish to implement a plugin which, if enabled by the user, reads the content on my website aloud. Presumably my best option is a website plugin; however, there might be some programming I've not come across which allows the likes of PHP to read content. I'm not entirely sure how this all works.
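
    For context, reading text aloud normally happens in the browser rather than in PHP. The Web Speech API that modern browsers expose makes a client-side sketch like this possible; browser support varies, so treat it as one option rather than the solution (the element IDs are illustrative):

      // client-side text-to-speech via the Web Speech API (where supported)
      function readAloud(selector) {
        if (!('speechSynthesis' in window)) {
          return; // no support: fall back to a hosted TTS service or hide the button
        }
        var text = document.querySelector(selector).textContent;
        window.speechSynthesis.speak(new SpeechSynthesisUtterance(text));
      }
      // e.g. wired to an "Ease of Access" button:
      // document.getElementById('read-page').onclick = function () { readAloud('main'); };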

    Read the article

  • Why does Bing not show my adCenter ads even though there is enough space?

    - by gamma
    I created several campaigns using Microsoft adCenter. I'm targeting the whole world, at any time, with 2-3 placement texts per keyword group. The bids I placed are sometimes quite high, so the ads should get displayed. When I search for my keywords on Bing, nothing is displayed, even though there is plenty of space for it. Bing mostly displays 2-3 ads, and the ones on the right side rather seldom. I'd like to know how I can get my ads displayed without increasing the bids any further.

    Read the article
