Search Results

Search found 9717 results on 389 pages for 'pro'.

  • Need some help with a Tomcat URL: mod_rewrite or mod_jk

    - by Redbull Fan
    I am trying to remove the context name from the URL of my server. Current URL: http://www.domainname.com/MyApp/. What I need is to make it available at www.domainname.com/. The server is only going to host one main app, and that app needs to be displayed when we open www.domainname.com/ in a browser. I have already tried a couple of things, like the below:

        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/(Context/.*)$
        RewriteRule ^/(.*)$ /Context/$1 [P,L]

    or:

        redirect permanent /MyApp/ abcd://domainname.com

    or using JkMount:

        JkMount /MyApp/* ajp13
        JkMount /MyApp* ajp13

    or deploying the WAR file to the ROOT of Tomcat and making the relevant changes in web.xml and server.xml. None of these are working, and I keep getting an internal error. I basically need a way to trim the Tomcat URL to make it shorter. Thanks, Andy
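
    One configuration that typically achieves this, assuming Apache sits in front of Tomcat and mod_proxy_ajp is available (the host and context names below are the question's examples), is to map the site root onto the application's context:

        # Sketch: forward everything at the root to the /MyApp context
        # over AJP (requires mod_proxy and mod_proxy_ajp).
        <VirtualHost *:80>
            ServerName www.domainname.com
            ProxyPass        / ajp://localhost:8009/MyApp/
            ProxyPassReverse / ajp://localhost:8009/MyApp/
        </VirtualHost>

    If Apache is not needed at all, the other common route is the one the question already hints at: deploy the application as ROOT.war (after removing Tomcat's stock ROOT webapp), which makes the context path empty with no rewriting involved.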

  • How do I access column data in a previous select statement from a sub-query? [closed]

    - by payling
    PROBLEM: How do I access column data from a previous SELECT statement in a sub-query? Below is a simple mock-up of what I'm attempting to do.

    Tables used: Quotes, Users

    QUOTES TABLE: qid (quote id), owner_uid, creator_uid

    SQL syntax:

        SELECT q.qid, q.owner_uid, q.creator_uid, owner.fname, owner.lname
        FROM quotes q,
             (SELECT u.fname, u.lname
              FROM users u
              WHERE u.uid = q.owner_uid) AS owner
        WHERE q.qid = '#'

    SUMMARY: I want to be able to take the quotes table's owner_uid and use it in the owner sub-query, so I can return all the owner info for that particular quote. The problem is that q.owner_uid is not recognized in the owner sub-query. What am I doing wrong?
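
    A derived table in the FROM clause cannot refer to columns of the other tables listed alongside it (only LATERAL joins or CROSS APPLY, where the database supports them, allow that), so the usual fix is a plain join. A sketch using the table and column names from the question:

        -- Join users directly instead of a correlated derived table.
        SELECT q.qid, q.owner_uid, q.creator_uid,
               owner.fname, owner.lname
        FROM quotes q
        JOIN users owner ON owner.uid = q.owner_uid
        WHERE q.qid = '#';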

  • rel="Canonical": Ranking Benefits ? & specifying for PDF?

    - by Miak
    I think I understand the basic case for using rel="canonical": to tell Google which is the preferred URI when the same page/content may be accessed via more than one URI. This helps you avoid duplicate-content penalties. But what else does it do? Does it also affect search ranking? I.e., will the page I specify in the canonical be ranked higher than the others (all else being equal)? And in the case of PDF documents, I understand that you can now specify rel="canonical" for them too, using HTTP headers (i.e. in .htaccess). Again, this would obviously help avoid duplicate-content penalties if the PDF content is the same as the HTML page, or if it can be accessed in more than one place. But does it affect ranking? Are there any other benefits to doing this?
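
    For reference, the HTTP-header variant mentioned above is delivered as a Link header. A hedged .htaccess sketch, assuming mod_headers is enabled (the file and target URL are placeholders):

        <Files "white-paper.pdf">
            Header add Link "<http://www.example.com/white-paper.html>; rel=\"canonical\""
        </Files>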

  • SEO and multiple domains to same site

    - by mwb
    I have one website and two domain names that I want to point to the same site install, so whether you go to name-one.com or name-two.com you see the exact same site. Now, I can either set up name-two.com to serve a 301 redirect to name-one.com, or I can set up name-two.com as a CNAME in the DNS pointing to name-one.com. What are the implications of each for SEO? What is recommended? I would guess it's better for branding to use a 301 redirect, so that visitors will see one consistent URL for my site, right? The reason I want the two domains is that I want a version with regional letters ('ö' instead of 'oe') in the name.
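
    Worth noting: a CNAME only aliases the DNS lookup, so both hostnames would end up serving identical content, which is precisely the duplicate-content situation a 301 avoids. For the redirect route, a minimal mod_rewrite sketch (assuming Apache, with the domain names from the question):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?name-two\.com$ [NC]
        RewriteRule ^(.*)$ http://name-one.com/$1 [R=301,L]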

  • Google Analytics - Traffic Source - Search engine - (Not Provided)

    - by Dharmavir
    I am using Google Analytics. When I go to the "Traffic Sources" overview, it shows the keyword "(not provided)" for almost 40% of my traffic. More than 90% of my search-engine traffic is from Google, and still more than 40% of the keywords are "(not provided)". Can anyone explain to me what is going on here, or how I can get that data? It comes up first and is the biggest keyword in the list. Could it be some crawler, or secure Google search?

  • Is browser and bot whitelisting a practical approach?

    - by Sn3akyP3t3
    With blacklisting, it takes plenty of time to monitor events, uncover undesirable behavior, and then take corrective action. I would like to avoid that daily drudgery if possible. I'm thinking whitelisting would be the answer, but I'm unsure if that is a wise approach, given its deny-all, allow-only-a-few nature. My fear is that eventually someone out there will be blocked unintentionally. Even so, whitelisting would also block plenty of undesired traffic to pay-per-use items such as the Google Custom Search API, as well as preserve bandwidth and my sanity. I'm not running Apache, but I'm assuming the idea would be the same. I would essentially be depending on the User-Agent identifier to determine who is allowed to visit. I've tried to account for accessibility, because some web browsers are more geared toward those with disabilities, although I'm not aware of any specific ones at the moment. I fully understand the need not to depend on whitelisting alone to keep the site away from harm; other means to protect the site still need to be in place. I intend to have a honeypot, a checkbox CAPTCHA, use of OWASP ESAPI, and blacklisting of previously known bad IP addresses.
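
    Expressed in Apache terms (the question's server differs, but the idea translates), a User-Agent whitelist might look like the sketch below. The token list is purely illustrative, and since User-Agent strings are trivial to spoof, this can only ever be one layer among the others mentioned:

        RewriteEngine On
        # Return 403 Forbidden unless the User-Agent matches an allowed token.
        RewriteCond %{HTTP_USER_AGENT} !(Mozilla|Opera|Googlebot|bingbot) [NC]
        RewriteRule .* - [F]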

  • How do I redirect a FQDN to an internal URL?

    - by Dave
    We have internal DNS servers where we've registered an FQDN that resolves internally: identityreg.domain.com. We also have an existing web page at https://iamserver.domain.com/product/default.asp?Workflow=process1. We need our users to be redirected to the existing web page URL whenever they type identityreg.domain.com. We're using IIS for the web server. I'm a newbie here, so forgive any misuse of terms. How do I get the FQDN to redirect to the URL?
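
    One way to do this, sketched below on the assumption that a small IIS site is bound to identityreg.domain.com and the URL Rewrite module is installed (host and target are from the question), is a permanent-redirect rule in that site's web.config:

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <!-- Send every request on this site to the existing page. -->
                <rule name="Redirect identityreg" stopProcessing="true">
                  <match url=".*" />
                  <action type="Redirect"
                          url="https://iamserver.domain.com/product/default.asp?Workflow=process1"
                          redirectType="Permanent" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>

    The built-in "HTTP Redirect" feature on the site would do much the same without the extra module.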

  • What is best practice for search engines when a website is under maintenance?

    - by jamescridland
    I need around a week to transition a heavily data-driven website from one back end to another. During that time I plan to keep some pages live, but they won't all work well or look brilliant, and some pages won't work at all. What is the best way to ensure I don't scare Google? Should I hide everything via robots.txt, mark everything that doesn't work as 503, or are there other things I should be considering?
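
    Serving 503 (rather than blocking crawlers via robots.txt) is the usual way to signal temporary unavailability. A hedged sketch for Apache 2.4 with mod_headers, answering everything except a maintenance page with 503 plus a Retry-After hint (the paths and retry interval are illustrative):

        ErrorDocument 503 /maintenance.html
        RewriteEngine On
        # Everything except the maintenance page itself returns 503.
        RewriteCond %{REQUEST_URI} !^/maintenance\.html$
        RewriteRule ^ - [R=503,L]
        Header always set Retry-After "86400"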

  • How to redirect users to a mobile website?

    - by iLa
    Hi, how do I redirect the user to a mobile site when they are accessing from a mobile device? Say, for example, I have a site called www.mysite.com. Now, when a person accesses the website from a mobile, it should redirect to www.mysite.com/mobile or www.m.mysite.com. From some research on Google, I found we can redirect using JavaScript by checking the user agent (browser):

        if (mobile browser) {
            // redirect to www.mysite.com/mobile
        } else if (normal browser) {
            // redirect to www.mysite.com
        }

    or using the screen resolution:

        if (screen resolution < 800) {
            // redirect to www.mysite.com/mobile
        } else if (screen resolution > 800) {
            // redirect to www.mysite.com
        }

    I think this will not work if JavaScript is disabled. Can we do this using .htaccess or PHP? Is there any standard mechanism for this?
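
    Server-side detection sidesteps the disabled-JavaScript problem entirely. A rough PHP sketch (the User-Agent patterns are illustrative and far from exhaustive; dedicated detection libraries maintain much larger lists):

        <?php
        // Naive mobile check against the User-Agent request header.
        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        if (preg_match('/iphone|ipod|android|blackberry|windows phone|mobile/i', $ua)) {
            header('Location: http://www.mysite.com/mobile', true, 302);
            exit;
        }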

  • Proper password handling for login

    - by piers
    I have read a lot about PHP login security recently, but many questions on Stack Overflow regarding security are outdated. I understand bcrypt is one of the best ways of hashing passwords today. However, for my site, I believe SHA-512 will do very well, at least to begin with. (I mean, bcrypt is for bigger sites, sites that require high security, right?) I'm also wondering about salting. Is it necessary for every password to have its own unique salt? Should I have one field for the salt and one for the password in my database table? What would be a decent salt today? Should I join the username together with the password and add a random word/letter/special-character combination to it? Thanks for your help!
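
    For what it's worth, bcrypt is not reserved for big sites; its per-password random salt and tunable cost are exactly what the salting questions above are reaching for, and the salt is stored inside the hash string, so one database column suffices. A PHP 5.3.7+ sketch via crypt() with the Blowfish scheme (the helper names are made up, and the OpenSSL extension is assumed; on PHP 5.5+ password_hash() supersedes this):

        <?php
        // Hash with bcrypt: the random 22-character salt is embedded in
        // the result, so salt and hash live in a single column.
        function hash_password($password) {
            $salt = substr(strtr(base64_encode(openssl_random_pseudo_bytes(16)), '+', '.'), 0, 22);
            return crypt($password, '$2y$10$' . $salt);
        }

        // Verify by re-hashing with the stored value as the salt spec.
        function check_password($password, $stored) {
            return crypt($password, $stored) === $stored;
        }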

  • Not able to track traffic on subdomain using Google Analytics

    - by Steven
    I'm trying to track traffic for my sub-domain, but it's not happening. This is how it's set up: my partner has a domain called sub1.partner.com. This domain points to partner1.mydomain.com. The idea is that users think they are browsing my partner's website, when they are in fact browsing pages on my server. My tracking code looks like this:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-xxxxxxxx-x']);
        _gaq.push(['_setDomainName', '.mysite.com']);
        _gaq.push(['_trackPageview']);

        (function() {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
        })();

    In Google Analytics I've created a new account under my main account and called it partner1.mysite.com. On this account I have created a filter:

        Filter type: Include
        Filter field: Hostname
        Filter pattern: partner1.mysite.no
        Case sensitive: No

    What more can I try to track traffic on my subdomain?

    UPDATE

    Question 1: Is this line correct?

        _gaq.push(['_setDomainName', '.mysite.com']);

    Question 2: Is it correct that I have to escape any punctuation with a backslash, like \. in filters?
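
    On question 2: Google Analytics filter patterns are regular expressions, so a literal dot should indeed be escaped. With the hostname from the filter above, the pattern would be (sketch):

        Filter field:   Hostname
        Filter pattern: partner1\.mysite\.no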

  • Should I submit my RSS feed as Google Sitemap?

    - by Svish
    I currently have no sitemap for a website I'm creating. I do have an RSS feed which includes the N latest updated posts on the site. It doesn't include everything on the site though, just blog posts. Creating a full sitemap would be a bit of a hassle I think. Should I submit the feed instead? Is there a difference between using a regular sitemap and a feed? Is it important to have a sitemap? What happens when you do/don't?

  • Hosting files with support for file tagging / keywords

    - by Zev Chonoles
    I have a large (approx. 25GB) collection of files I would like to host online for people to view or download. I have a spare computer I can use as a dedicated server for these files. I'm looking for a method of, or piece of software for, hosting my files where I can assign tags or keywords to the files, and people viewing my files online can search the collection via the tags. By way of approximate solutions I've found so far, I see that there is software such as Collectorz.com or Readerware for creating databases of one's books / music / movies, and these databases can be searched by tags or keywords, and the databases can be made available and searchable online; this would suit my purposes except that my files are not necessarily books, music, or movies, and I want the files themselves accessible online, not a database describing my files. A commercially-available solution like the ones above would be acceptable, but I'd prefer to have the whole setup under my control (i.e. I'd like to either implement it by hand, or use commercial software that doesn't rely on using the company's servers, paying them a continued fee, etc.). The current extent of my internet experience is designing a few Google Sites, so I know there's a fair chance I won't understand the answers I receive, but I'm always happy to have a summer project :)

  • Redesigning an old site, structure changes, etc.

    - by RhymeGuy
    I have an old site built in 2006; it has around 200 pages and 500 pictures. Every single page is indexed, as are the images. It is very well ranked for its targeted keywords and I receive a good amount of SEO traffic (I guess that's due to the various campaigns, branding, PPC, etc.). Problem: the site has an outdated design, the pages and images have less-than-proper names, there are no heading or alt tags, it was built with tables, inline CSS, etc. Goal: completely redesign the site: use divs, change file names, add proper meta data, alt tags, etc. Question: how can this affect current SEO positions? I will 301-redirect every single page to the new one and build a site map, but what do I do with the images? Do I need to redirect them also? Any other suggestions?
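
    Images that have earned rankings of their own keep that equity the same way pages do: with 301s from the old file names to the new ones. A hedged .htaccess sketch with made-up file names:

        # mod_alias: one-off renames
        Redirect 301 /images/img0001.jpg /images/red-widget-front.jpg
        # mod_alias: a whole directory that moved
        RedirectMatch 301 ^/old-gallery/(.*)$ /gallery/$1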

  • Weird Results A/B Test in Google Website Optimizer

    - by Yisroel
    I set up a test in Google Website Optimizer that has three variations: original (A), B, and C. In order to further validate the results of the test, I added variation C as an exact copy of the original. And that's where the results get weird. Six days into the test, the best-performing variation is C. It outperforms the original by 18.4%! How is that possible? Should I now discount the results of this test entirely?

  • How to ask an acceptable question? [closed]

    - by Richard Seitz
    My wife and I own a small art business for the purpose of selling paintings, done by my wife. We feel, because of demand and comments, that we should now consider the web to sell our art. How do we start? We already have a registered domain name. We have talked to many people who say they are experts in the design and hosting of websites. What should we ask a potential designer of our website to determine whether they are professionals or just thieves? What does one ask to evaluate the competence and integrity of a web designer? Thank you.

  • Setting up a Google Analytics Campaign

    - by Ashfame
    I will be doing a bunch of things to give one of my projects (the main app) a big initial push, for which I will be building a few small Facebook apps to help promote the main app. Traffic from these apps needs to be tracked individually. My main app will be posting on users' walls when they need to be notified; traffic from these posts needs to be tracked. Traffic from emails sent by the main app needs to be tracked as well, broken down by type of email. I need to track all of these, and possibly a couple more, but I need to be sure that I build my campaign URLs correctly, as I won't get another chance to fix them. Correct me where I am wrong. For emails:

        Campaign Name: Launch
        Campaign Medium: Email
        Campaign Source: Type1 or Type2 (I can break it down for different types of email, right?)

    For apps:

        Campaign Name: Launch
        Campaign Medium: Apps
        Campaign Source: App1 or App2 (I can break it down here for different apps, right?)

    What if I want to track two different links within a single email or a single app? Is there any way of tracking them individually while still being able to analyze them as one, which makes more sense for me? Are Campaign Term and Campaign Content irrelevant in my case, or can/should I use them for something? I will also be tracking traffic across the different apps. Should I do more? Let me know if my scenario wasn't clear enough and I need to explain more.
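
    Mapping that scheme onto the URL parameters, utm_content is the standard slot for distinguishing two links that share the same campaign, medium, and source: reports can still be rolled up by source while each link stays individually visible. Illustrative URLs (the domain and values are placeholders):

        http://example.com/?utm_campaign=launch&utm_medium=email&utm_source=type1&utm_content=header-link
        http://example.com/?utm_campaign=launch&utm_medium=email&utm_source=type1&utm_content=footer-link
        http://example.com/?utm_campaign=launch&utm_medium=apps&utm_source=app1&utm_content=invite-button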

  • Facebook connect vs. OpenID

    - by digit1001
    I just started working on a new project that has a general login feature. One suggestion in a meeting was to look into Facebook Connect or OpenID as an alternative. I'm curious whether one has less of a learning curve, and whether they can both be used on the same site. Also, when you use either, do you have them initially create the account and just get a true/false verification back, which you then use to set up a local user account? And what about forgotten passwords? I'm curious as to the best practices for integrating this type of login with a "traditional" one where you store the user info yourself. Thanks, D

  • Linking with an image: Background vs <img>

    - by FreshCode
    What is considered best practice (semantically) when using text with an image to link to an internal page or category?

    Option 1:

        <nav>
            <a href="/kittens">
                <img src="kittens.png" />
                <span>Kittens</span>
            </a>
            <a href="/puppies">
                <img src="puppies.png" />
                <span>Puppies</span>
            </a>
        </nav>

    Option 2:

        <nav>
            <a href="/kittens" class="kittens">Kittens</a>
            <a href="/puppies" class="puppies">Puppies</a>
        </nav>

    where the CSS is defined as:

        a.kittens { background-image: url("kittens.png"); width: 40px; height: 60px; }
        a.puppies { background-image: url("puppies.png"); width: 40px; height: 60px; }

    Should I use a styled background for the link, or an <img> inside the anchor element?

  • Application for Google AdSense rejected twice due to unacceptable site content

    - by Bootcamp
    I have a technical blog at techaxe.com where I put up technical articles quite frequently. I applied for Google AdSense and got my application rejected twice. The issue reported both times was "unacceptable site content". I read Google's content policy and found nothing that would indicate that the content I have on my blog is unacceptable to them. Can someone please guide me as to what should be done so that I can get my AdSense application accepted?

  • Some Facebook Pages Show Tabs On The Top Of Page, And Others On The Left. Why?

    - by mickburkejnr
    Hi everyone, I am developing a Facebook page for my web design/development business, and I've noticed that the page I have created has its tabs aligned to the left-hand side of the page, underneath the page image. Other pages, such as Wetherspoons and Porsche, have their tabs aligned to the top of the page. Why is this? Is there a configuration option I've missed that allows you to change the layout of the tabs? Cheers!

  • mod_rewrite and SEO friendliness

    - by John Doe
    My website has an atypical structure and I'm not sure if this could create problems in the long run, especially for SEO positioning purposes. I have a single, large PHP script, and I use the Apache module mod_rewrite in the .htaccess file to create friendly URLs, for example:

        RewriteRule ^$ /index.php?section=Main
        RewriteRule ^createArticle$ /index.php?section=Main&view=CreateArticle
        RewriteRule ^configuration$ /index.php?section=Configuration
        RewriteRule ^article/([0-9]{1,10})$ /index.php?section=Article&view=Default&id=$1
        RewriteRule ^deleteArticle/([0-9]{1,10})$ /index.php?section=Article&view=Delete&id=$1
        RewriteRule ^reportArticle/([0-9]{1,10})$ /index.php?section=Article&view=Report&id=$1
        RewriteRule ^logIn$ /index.php?section=Authentication
        ...

    So, www.example.com/index.php?section=Article&view=Default&id=105 becomes www.example.com/article/105. The only real physical file is index.php, in which the URL parameters are processed and the corresponding result is output. My question is: do crawling robots (e.g. Googlebot) recognize these links? Do they index the HTML output by index.php with the specified parameters as if it were an actual HTML file? Also, would this become a problem when creating a sitemap?
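
    Crawlers only see what the HTTP layer returns, so the rewritten URLs are indexed like any static page; the underlying index.php never has to be exposed. For the sitemap, the friendly URLs are simply the ones to list. A minimal sketch built from the question's examples:

        <?xml version="1.0" encoding="UTF-8"?>
        <!-- List the public (rewritten) URLs, not index.php?section=... -->
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url><loc>http://www.example.com/article/105</loc></url>
          <url><loc>http://www.example.com/configuration</loc></url>
        </urlset>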

  • WordPress theme shows a blank page

    - by HotPizzaBox
    I'm trying to create a basic WordPress theme. As far as I know, the basic files I need are style.css, header.php, index.php, footer.php, and functions.php. Then it should show a blank site with some meta tags in the header. These are my files:

    functions.php:

        <?php
        // load the language files
        load_theme_textdomain('brianroyfoundation', get_template_directory() . '/languages');

        // add menu support
        add_theme_support('menus');
        register_nav_menus(array('primary_navigation' => __('Primary Navigation', 'BrianRoyFoundation')));

        // create widget areas: sidebar
        $sidebars = array('Sidebar');
        foreach ($sidebars as $sidebar) {
            register_sidebar(array('name' => $sidebar,
                'before_widget' => '<div class="widget %2$s">',
                'after_widget' => '</div>',
                'before_title' => '<h6><strong>',
                'after_title' => '</strong></h6>'
            ));
        }

        // Add Foundation 'active' class for the current menu item
        function active_nav_class($classes, $item) {
            if ($item->current == 1) {
                $classes[] = 'active';
            }
            return $classes;
        }
        add_filter('nav_menu_css_class', 'active_nav_class', 10, 2);
        ?>

    header.php:

        <!DOCTYPE html>
        <html <?php language_attributes(); ?>>
        <head>
            <meta charset="<?php bloginfo('charset'); ?>" />
            <meta name="description" content="<?php bloginfo('description'); ?>">
            <meta name="google-site-verification" content="">
            <meta name="author" content="Your Name Here">
            <!-- No indexing if Search page is displayed -->
            <?php if(is_search()){ echo '<meta name="robots" content="noindex, nofollow" />' } ?>
            <title><?php wp_title('|', true, 'right'); bloginfo('name'); ?></title>
            <link rel="stylesheet" type="text/css" href="<?php bloginfo('stylesheet_url'); ?>" />
            <?php wp_head(); ?>
        </head>
        <body>
            <div id="page">
                <div id="page-header">
                    <div id="page-title">
                        <a href="<?php bloginfo('url'); ?>" title="<?php bloginfo('name'); ?>"><?php bloginfo('name'); ?></a>
                    </div>
                    <div id="page-navigation">
                        <?php wp_nav_menu( array( 'theme_location' => 'primary_navigation', 'container' => false, 'menu_class' => '' ); ?>
                    </div>
                </div>
                <div id="page-content">

    index.php:

        <?php get_header(); ?>
        <div class="page-blog">
            <?php get_template_part('loop', 'index'); ?>
        </div>
        <div class="page-sidebar">
            <?php get_sidebar(); ?>
        </div>
        <?php get_footer(); ?>

    footer.php:

                </div>
                <div id="page-footer">
                    &copy; 2008 - <?php echo date('Y'); ?> All rights reserved.
                </div>
            </div>
            <?php wp_footer(); ?>
        </body>
        </html>

    I activated the theme in WordPress, but it just shows nothing, not even in the page source. Can anyone help?
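
    A completely blank page, even in view-source, usually means a PHP parse error with error display turned off, and header.php above contains two likely culprits. Shown corrected below as a sketch (setting WP_DEBUG to true in wp-config.php should surface the exact error):

        <?php
        // 1. The robots echo is missing its terminating semicolon before '}':
        if (is_search()) { echo '<meta name="robots" content="noindex, nofollow" />'; }

        // 2. wp_nav_menu() is never closed; the ')' before the ';' only
        //    closes array(), so add the missing parenthesis:
        wp_nav_menu(array(
            'theme_location' => 'primary_navigation',
            'container'      => false,
            'menu_class'     => '',
        ));
        ?>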

  • 302 Redirect Issue for Joomla 2.5.7 version site

    - by DDD
    For my site I am using Joomla 2.5.7 and the Facebook comments tool for the articles on the site. I am getting a 302-redirect problem with the FB comments for the articles I post. I have checked the URLs at http://www.webconfs.com/http-header-check.php and got the following result, with a 302 redirect, for http://www.fijoo.com:

        HTTP/1.1 302 Moved Temporarily
        Date = Wed, 21 Nov 2012 09:46:39 GMT
        Server = Apache/2.2.22 (Unix) mod_ssl/2.2.22 OpenSSL/1.0.0-fips mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 mod_perl/2.0.6 Perl/v5.10.1
        X-Powered-By = PHP/5.3.16
        Set-Cookie = =en-GB; expires=Wed, 21-Nov-2012 10:46:40 GMT
        LOCATION = /
        Content-Length = 0
        Connection = close
        Content-Type = text/html

    How do I overcome this? Anyone, please help.

  • Can I make Google Analytics set its cookies on just a subdomain? (i.e. www.domain.com, not domain.com)

    - by Paul D. Waite
    I'm using Google Analytics on a site; let's call it www.domain.com. My Google Analytics website profile is for www.domain.com, and my only report is set up for www.domain.com. Requests to domain.com redirect permanently to www.domain.com. I've got the regular Analytics JavaScript on my index page for the domain. For some reason, it seems to be setting its cookies for domain.com instead of www.domain.com. This is unfortunate, as I've got cdn.domain.com set up as a CDN using Amazon CloudFront, so I'd rather not have useless cookies (Analytics seems to set four cookies) cluttering up those requests. How can I make Analytics set cookies for www.domain.com instead of domain.com?
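
    With the classic ga.js queue, the cookie domain is controlled by _setDomainName, so pinning it to the www host (or passing 'none') should keep the cookies off the bare domain and therefore away from cdn.domain.com. A sketch with a placeholder account ID:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXXXX-X']);
        // Scope the GA cookies to the www subdomain instead of .domain.com.
        _gaq.push(['_setDomainName', 'www.domain.com']);
        _gaq.push(['_trackPageview']);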
