Search Results

Search found 14702 results on 589 pages for 'dolby pro logic'.


  • Remove IP address from the URL of website using apache

    - by sapatos
    I'm on an EC2 instance and have a domain, domain.com, pointed at the EC2 name servers, and it happily serves my pages if I type domain.com in the URL. However, when a page is served, the URL resolves to 1.1.1.10/directory/page.php. Following the examples at http://httpd.apache.org/docs/2.0/dns-caveats.html, I've set up the following VirtualHost in Apache:

        Listen 80
        NameVirtualHost 1.1.1.10:80
        <VirtualHost 1.1.1.10:80>
            DocumentRoot /var/www/html/directory
            ServerName domain.com
            # Other directives here ...
            <FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
                Header set Cache-Control "max-age=290304000, public"
            </FilesMatch>
        </VirtualHost>

    However, I'm not seeing any change in how the URL is displayed. This is the only VirtualHost configured on this server, and I've confirmed it's the one being used, as I've managed to break it a number of times whilst experimenting with the configuration. The Route 53 entries I have are:

        domain.com  A    1.1.1.10
        domain.com  NS   ns-11.awsdns-11.com ns-111.awsdns-11.net ns-1111.awsdns-11.org ns-1111.awsdns-11.co.uk
        domain.com  SOA  ns-11.awsdns-11.com. awsdns-hostmaster.amazon.com. 1 1100 100 1101100 11100
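    ServerName affects which VirtualHost answers a request, not what the browser's address bar shows, so something else (the application, or a redirect elsewhere in the config) is likely exposing the IP. A minimal sketch of one common fix, assuming mod_rewrite is available: 301 any request that arrives under the raw IP back to the domain.

        RewriteEngine On
        # Requests addressed to the bare IP get bounced to the canonical hostname
        RewriteCond %{HTTP_HOST} ^1\.1\.1\.10$
        RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]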

    Read the article

  • Can I enable GZIP on Godaddy?

    - by Dave
    One of my sites lives on GoDaddy's bottom-of-the-line cheap hosting. I have the correct code in my .htaccess file, but it's not compressing because mod_deflate is not loaded. How do I enable that? The best I've found is this article, which suggests I use PHP to zip everything (which is going to be more work than just changing hosting companies): "Add the following code to the very top of your Web pages, above the DOCTYPE:

        if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) ob_start("ob_gzhandler");
        else ob_start();
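    For what it's worth, ob_gzhandler negotiates Accept-Encoding on its own, so the buffering approach can be reduced to a couple of lines at the top of each page. A minimal sketch, assuming PHP's zlib extension is available (ob_start() returns false if the handler can't be used):

        <?php
        // ob_gzhandler checks the client's Accept-Encoding header itself and
        // falls back to an uncompressed response when gzip isn't accepted.
        if (!ob_start('ob_gzhandler')) {
            ob_start(); // zlib unavailable: buffer without compression
        }
        ?>

    Some shared-hosting plans also honour a per-directory php.ini with zlib.output_compression = On, which achieves the same thing without touching page code; whether GoDaddy's cheapest tier allows that is worth checking with them.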

    Read the article

  • Google author information still hasn't displayed my details in search results

    - by Jayapal Chandran
    I followed the instructions below, but I'm still not sure whether I've completely understood them. http://www.google.com/support/webmasters/bin/answer.py?answer=1408986 http://www.labnol.org/internet/author-profile-in-google/19775/ I did the above last week, and my picture has not appeared in Google search results. First I added a Google+ anchor link with rel=author to certain web pages, and in my Google+ profile I added those pages. After updating, I used the following to verify: http://www.google.com/webmasters/tools/richsnippets?url=http%3A%2F%2Fvikku.info%2Fcodesnippets%2Fphp%2F&view= You can see that my pic is appearing at the right (here is a screenshot). So what am I missing? Why is it not in the search results? The author of labnol.org said it would take 3 days for my profile photo to appear. Google states the following: "Note that there is no guarantee that a Rich Snippet will be shown for this page on actual search results. For more details, see the FAQ (http://knol.google.com/k/google-rich-snippets-tips-and-tricks#Frequently_Asked_Questions)." Fingers crossed.
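    For reference, the two-way link Google checked for at the time was a rel=author anchor on each article pointing to the Google+ profile, with the profile's "Contributor to" section linking back to the site. A minimal sketch (the profile ID is a placeholder):

        <!-- on each article page -->
        <a href="https://plus.google.com/000000000000000000000?rel=author">My Name on Google+</a>

    If the Rich Snippets testing tool already shows the photo, the markup side is usually fine; appearing in live results was a separate, unguaranteed step.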

    Read the article

  • Serve most of a domain with Apache, but use mod_proxy to serve some URLs from Lighttpd

    - by Alex Pineda
    We wish to host some pages on a new server with apache2, and embed some of our old content & functionality from another server running lighttpd in an iframe. I'm looking at this configuration from the Apache docs (http://httpd.apache.org/docs/2.2/vhosts/examples.html#page-header), under "Using Virtual_host and mod_proxy" together:

        <VirtualHost *:*>
            ProxyPreserveHost On
            ProxyPass / http://192.168.111.2/
            ProxyPassReverse / http://192.168.111.2/
            ServerName hostname.example.com
        </VirtualHost>

    The only issue is that I want to proxy only a subdomain, or even better, keep the top domain and proxy only if the URL contains a particular path, e.g. "/myprocess.php". So in essence the DNS will point to apache2 as the "master router".
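    ProxyPass maps a path prefix, not just /, so proxying a single path is simply a narrower mapping; everything else falls through to the local DocumentRoot. A sketch reusing the addresses from the question:

        <VirtualHost *:80>
            ServerName hostname.example.com
            DocumentRoot /var/www/html

            ProxyPreserveHost On
            # Only this path is handed off to the lighttpd server
            ProxyPass        /myprocess.php http://192.168.111.2/myprocess.php
            ProxyPassReverse /myprocess.php http://192.168.111.2/myprocess.php
        </VirtualHost>

    The subdomain variant is just a second <VirtualHost> whose ServerName is the subdomain, containing the ProxyPass / ... lines from the docs example.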

    Read the article

  • Sending HTML to Gmail always lands in Spam

    - by cartaysm
    I am having an issue sending HTML emails to Gmail. I can send them to Yahoo, Hotmail, RR, AOL, etc. with no problem at all, but when I send them to Gmail I get kicked to spam. I have checked my IP against a lot of different blocklists to make sure it is not listed anywhere, which it is not:

        spamhaus = not listed in the DBL
        abuse.net = not listed in the SBL
        abuse.net = not listed in the PBL
        abuse.net = not listed in the XBL
        spamcop = not listed in bl.spamcop.net

        host 24.172.204.xxx
        xxx.204.172.24.in-addr.arpa domain name pointer xxxevents.com.
        host xxxevents.com
        xxxevents.com has address 24.172.204.xxx
        xxxevents.com mail is handled by 10 mail.xxxevents.com.

    I am just trying to send a very VERY basic HTML message (listed below). I use an Ubuntu server, Swiftmailer, multipart/alternative (HTML & plain), SPF = pass, and I am going to set up DKIM today to see if that fixes it (but I doubt it will). For now I will only post the message that gets kicked to spam, and I can provide any details needed.

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head><title>Triathlon</title></head>
        <body>
        <table cellpadding="0" cellspacing="0">
          <tr>
            <td>
              <p>Thank you for attending our 4th annual Triathlon/Duathlon/5k at Hueston Woods State Park on August 12th. This event is held annually to raise research funding for Crohn's Disease, Ulcerative Colitis, and Muscular Dystrophy diseases.</p>
            </td>
          </tr>
          <tr>
            <td>
              <p>As you know the results and pictures have been posted on our home page at since Sunday 8/13/2012. Now we also have updated our Facebook page with those photos and you can start tagging yourself or downloading the pictures now! <br /> our page and tag yourself at </p>
              <p> test test </p>
              <p>Race day events is professionally managed by Speedy-Feet</p>
            </td>
          </tr>
        </table>
        </body>
        </html>

    Just plain text works great; I thought maybe the wording was the problem, but that's not the case. I am almost done installing OpenDKIM, so I will be able to rule that out very soon. Edit: Okay, I installed OpenDKIM and I am getting passing results, so I sent the HTML posted above and it went through just fine. But when I start to add a few more lines I get kicked back to spam again. Here is the updated HTML:

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head><title>Triathlon</title></head>
        <body>
        <table cellpadding="0" cellspacing="0">
          <tr>
            <td>
              <center><a href='http://xxxevents.com' target="_blank">
              <font face="Verdana, Arial, Helvetica, sans-serif" color="#666666" size="2">
              <img src="http://xxxevents.com/marketemailimages/xxxlogo.png" alt="xxx It Events | Raising funds for Crohns, Colitis, and Muscular Dystrophy" border="0" />
              </font></a></center>
            </td>
          <tr>
            <td>
              <p>Thank you for attending our 4th annual Triathlon/Duathlon/5k at Hueston Woods State Park on August 12th. This event is held annually to raise research funding for Crohn's Disease, Ulcerative Colitis, and Muscular Dystrophy diseases.</p>
            </td>
          </tr>
          <tr>
            <td>
              <p>As you know the results and pictures have been posted on our home page at since Sunday 8/13/2012. Now we also have updated our Facebook page with those photos and you can start tagging yourself or downloading the pictures now! <br /> our page and tag yourself at </p>
              <p> test test </p>
              <p>Race day events is professionally managed by Speedy-Feet</p>
            </td>
          </tr>
        </table>
        <table width="100%" border="0" cellspacing="0" cellpadding="0">
          <tr>
            <td valign="top">
              <div align="center" style="font-family:Verdana, Arial, Helvetica, sans-serif; font-size:10px;"><br />PO Box xxx Maineville, OH 45039<br />
              <a href="mailto:[email protected]">[email protected]</a> | <a href='http://xxxevents.com' target="_blank">xxxevents.com</a><br />
              <br />
              </div>
            </td>
          </tr>
        </table>
        </body>
        </html>
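    Since Swiftmailer with multipart/alternative is in play, a minimal baseline sketch of that setup may help when bisecting which addition trips the filter (addresses, subject, and transport settings are placeholders; the API shown is Swiftmailer 4.x as current at the time):

        <?php
        require_once 'swift_required.php';

        $transport = Swift_SmtpTransport::newInstance('localhost', 25);
        $mailer    = Swift_Mailer::newInstance($transport);

        // Plain-text body with HTML as the alternative part: a missing or
        // mismatched text part is reportedly one of Gmail's spam signals.
        $message = Swift_Message::newInstance('Triathlon results')
            ->setFrom(array('info@example.com' => 'Example Events'))
            ->setTo(array('recipient@example.com'))
            ->setBody('Thank you for attending our 4th annual Triathlon...', 'text/plain')
            ->addPart('<p>Thank you for attending our 4th annual Triathlon...</p>', 'text/html');

        $mailer->send($message);
        ?>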

    Read the article

  • Why are the tags on my WordPress site being indexed instead of the posts?

    - by Bernard
    I can't figure out why my tags are being indexed by Google instead of my actual posts. In Google, my posts are showing up as mysite.com/tags/post, and of course I want them to look like mysite.com/category/actualpost. Any ideas what could be wrong? My domain is 3 years old and I just started a new focus for an existing site. There is no duplicate content, I have a sitemap submitted to Webmaster Tools, a robots.txt... everything I need. This is the first time something like this has happened to me. Let me know if anyone has any ideas.
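    One common remedy, sketched below under the assumption of a standard theme that calls wp_head(), is to mark tag archives noindex so Google prefers the posts themselves; SEO plugins of the era (WordPress SEO, All in One SEO) expose the same switch without code:

        <?php
        // In the active theme's functions.php: noindex tag archives only.
        function my_noindex_tag_archives() {
            if (is_tag()) {
                echo '<meta name="robots" content="noindex,follow" />' . "\n";
            }
        }
        add_action('wp_head', 'my_noindex_tag_archives');
        ?>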

    Read the article

  • What is the right way to do this HTML: header with icon linked

    - by Hell Awaits
    I know how to make these examples look and behave the same, but I would like to know which is the right way to build the HTML structure. <a><img><h1></h1></a> - looks wrong, because a block element (the h1) sits inside an inline element (the a). <a><img></a><h1><a></a></h1> - the same a element is defined twice, and I'm also not sure about markup inside headers. Any other solutions?
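    For what it's worth, HTML5 relaxed the old restriction: an <a> element may wrap flow content, so the single-link version validates under an HTML5 doctype. A sketch:

        <header>
            <a href="/">
                <img src="logo.png" alt="" />
                <h1>Site title</h1>
            </a>
        </header>

    Under an XHTML 1.0 doctype, where a block inside <a> is invalid, the usual workaround is the second form, styled so both anchors behave as one.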

    Read the article

  • Canonicalization of single, small pages like reviews or product categories

    - by Valorized
    In general I pretty much like the idea of canonicalization, and in most cases Google explains the possible procedures clearly. For example: if I have duplicates because of parameters (e.g. &sort=desc), it's clear to use the canonical for the page, provided within the head tag. However, I'm wondering how to handle small (not to say thin-content) sites. What's my definition of a small site? An example: on one of my main sites we use a directory-based URL structure: example.com/ (root), example.com/category-abc/, example.com/category-abc/produkt-xy/. Moreover, we provide one page that includes all products: example.com/all-categories/ (lists all products the same way as the categories do). For reviews, we use a similar structure: example.com/reviews/product-xy/ shows all reviews for one product, example.com/reviews/product-xy/abc-your-product-is-great/ shows one single review, and example.com/reviews/ shows all reviews for all products (latest first). To make it even more complicated: on every product page, the latest 2 reviews appear at the end. So, as you can see, a lot of potential duplicates. Q1: Should I create canonicals for (a) example.com/category-abc/ to example.com/all-categories/, (b) example.com/reviews/product-xy/abc-your-product-is-great/ to example.com/reviews/product-xy/ or to example.com/reviews/, or none of them? Q2: Can I link the collection of categories (all-categories/) and the collections of reviews (reviews/ and reviews/product-xy/) to the single category and the single review, respectively? Example: example.com/reviews/ includes, let's say, 100 reviews. Can I somehow use markup that tells search engines: "Hey, wait, you are now looking at a collection of 100 reviews - do not index this collection, you should rather index every single review as its own page!"? In HTML it might be something like this (which, of course, does not work; it's only to show what I mean): <div class="review" rel="canonical" href="http://example.com/reviews/product-xz/abc-your-product-is-great/">HERE GOES THE REVIEW</div> Reason: I don't think it is a great user experience if the user searches for "your product is great" and lands on example.com/reviews/ instead of example.com/reviews/product-xy/abc-your-product-is-great/. On the first page he will have to search and might give up out of frustration; the second result might lead to a conversion. The same applies to categories: if the user is searching for category Z, he might land on the all-categories page and have to scroll down to the (last) category to find what he searched for. So what's best practice? What should I do?
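    For reference, the canonical hint is a per-page <link> in the head of the duplicate page, pointing at the preferred URL; it cannot be attached per-review inside a listing. A sketch for one duplicate page, reusing a URL from the question:

        <head>
            ...
            <link rel="canonical" href="http://example.com/reviews/product-xy/abc-your-product-is-great/" />
        </head>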

    Read the article

  • Where can I find comscore rank?

    - by Joyce Babu
    Recently an ad network rejected my registration, stating that my site doesn't meet their minimum monthly impressions, even though the site serves three times the required page views. When I contacted them for details, their representative hinted that they use comScore data to screen submissions. Where can I view my site's comScore ranking and details? Update: I was able to find the traffic figures by tagging my site with comScore Direct.

    Read the article

  • JS and CSS caching issue: possibly .htaccess related

    - by adamturtle
    I've been using the HTML5 Boilerplate for some web projects for a while now and have noticed the following issue cropping up on some sites. My CSS and JS files, when loaded by the browser, are being renamed to things like: ce.52b8fd529e8142bdb6c4f9e7f55aaec0.modernizr-1,o7,omin,l.js …in the case of modernizr-1.7.min.js The pattern always seems to add ce. or cc. in front of the filename. I'm not sure what's causing this, and it's frustrating since when I make updates to those files, the same old cached file is being loaded. I have to explicitly call modernizr-1.7.min.js?v=2 or something similar to get it to re-cache. I'd like to scrap it altogether but it still happens even when .htaccess is empty. Any ideas? Is anyone else experiencing this issue?
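    The ce./cc. markers match the filenames mod_pagespeed produces when it rewrites assets (ce = cache-extend, cc = combine CSS), so a reasonable guess (it is only a guess) is that the host runs mod_pagespeed in front of your files, which would also explain why an empty .htaccess changes nothing. If that's right and the host permits per-directory overrides, this sketch turns the rewriting off:

        # Disable mod_pagespeed rewriting for this directory (assumes the
        # module is present and .htaccess overrides are permitted)
        <IfModule pagespeed_module>
            ModPagespeed off
        </IfModule>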

    Read the article

  • Difference between using third-party social buttons and integrating each social button ourselves

    - by Jayapal Chandran
    I wanted to add specific social buttons to my article, so I used ShareThis. It gives a Facebook Like button, a Google+ button, etc. by default, whereas in other articles of different modules I had integrated the Facebook Like button myself by following the documentation (including markup in the head section). What is the difference between adding it manually, with all the markup, and using third-party code? Will it affect SEO, or give any other advantage on the respective social networking site (here, for example, Facebook and Google+)?
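    For comparison, a sketch of what direct integration looked like at the time: Open Graph tags in the head (the extra markup the Facebook documentation asks for) plus the Like button via the JS SDK. URLs and titles are placeholders:

        <!-- in <head>: metadata Facebook reads when the page is liked/shared -->
        <meta property="og:title" content="Article title" />
        <meta property="og:url"   content="http://example.com/article" />

        <!-- in <body>: the Like button itself -->
        <div id="fb-root"></div>
        <script src="http://connect.facebook.net/en_US/all.js#xfbml=1"></script>
        <div class="fb-like" data-href="http://example.com/article" data-send="false"></div>

    Third-party widgets such as ShareThis generally inject equivalent markup at runtime; the head metadata is the part they can't add for you.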

    Read the article

  • Set up development site on another server/host

    - by Ofeargall
    I'm developing a site for a client. They've got a site now that's hosted at hosting.com. I'm going to move them to my VM hosting solution at Edgeweb, but I want to run some tests and have the client approve the site before changing the name servers to the new location. How do I make this happen? I'm running Red Hat/Apache on Linux for the Edgeweb hosting. I don't have control of the domain name (the client controls that right now). Edgeweb has set up a DNS zone for the domain name so that when the time comes to switch, we're ready to go. I'm a web developer and I understand the technologies that make a user experience 'work', but I'm unfamiliar with the server jargon, so please be patient. Thanks in advance.
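    The standard trick is to preview the new server without touching public DNS: map the domain to the new VM's IP in your (and the client's) local hosts file, browse the site as if it were live, then remove the entry. A sketch, with a placeholder IP:

        # /etc/hosts on Linux/Mac; C:\Windows\System32\drivers\etc\hosts on Windows
        # 203.0.113.10 stands in for the new VM's address
        203.0.113.10    domain.com www.domain.com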

    Read the article

  • A mechanism to include site title in every page, but not in <title> element

    - by Saeed Neamati
    Each site can have a name, for example "site x", and each page can have a name (or title) that should appear in the <title> tag in the header. However, many websites out there use the combination "site name - page name" as the value of the <title> tag, which I find a little less than semantic. On the other hand, if you only include the page title in the <title> tag, search engines won't find your site by its name. For example, if your site's name is Thought Results and you don't include it in page titles, then a search for "Thought Results" won't find your site in the SERPs. Thus I'm searching for a mechanism that both includes the site title (not page title) on every page, and also includes only the page title in the <title> tag, to get more semantic results. Is there any way to achieve this?
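    One option, sketched below, is to keep <title> page-only and expose the site name through separate machine-readable markup (such as Open Graph's og:site_name) plus a visible site name in the page header; there is no guarantee search engines weight this the way a title suffix is weighted, so treat it as an experiment:

        <head>
            <title>How to frobnicate widgets</title>
            <meta property="og:site_name" content="Thought Results" />
        </head>
        <body>
            <div id="branding">Thought Results</div>
            ...
        </body>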

    Read the article

  • Shopping Cart URL Structure

    - by Drew
    In regards to URL structure for guests versus authenticated users: am I able to track traffic associated with both paths, but at the same time track total conversions going through the shopping cart? I have set up the following URL structure.

    Authenticated users follow this path:

        /cart
        /checkout
        /checkout-confirmation-ty

    Guests go like such:

        /cart
        /checkout-guest
        /checkout-confirmation-guest-ty

    Can I track the authenticated and guest users separately? Is this possible with Google Analytics?
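    With classic ga.js (current at the time), one approach is a virtual pageview fired on both confirmation pages: a single goal then counts total conversions, while the real URLs keep the two flows separable in content reports. A sketch, with a made-up virtual path:

        <script type="text/javascript">
          // On BOTH confirmation pages, after the normal tracking snippet
          _gaq.push(['_trackPageview', '/checkout-complete']);
        </script>

    The goal URL becomes /checkout-complete; segmenting by the real confirmation URLs splits guests from authenticated users.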

    Read the article

  • Help deploying using Capistrano to HostGator

    - by Kyle Macey
    My company uses HostGator to host our web sites, and I'm having a heck of a time figuring out my final steps to get a functioning RoR app up there. I've got as far as configuring Mongrel (I think?) and being able to run deploy:cold without any errors. However, I can't seem to get the app to show up in the designated cPanel area (HostGator says the name "current" is already reserved for another application), and I'm not sure which port was allocated for me to use. I've opened tickets with customer support, just to be told "You can't access the database with root" - totally unrelated to my question. So I think I'm in the final stretch, and if anyone has any insight or experience with HostGator, please clue me in.

    Read the article

  • Ensure we're found in Facebook search for both full & abbreviated company names?

    - by hawbsl
    We have a client with a Facebook page. Let's say his company is called Bob Roberts Super Widgets. If you search Facebook for Bob Roberts Super Widgets, up he pops. But the shorthand he's commonly known by is BR Super Widgets, and indeed the website we've created for him is br-super-widgets.com. Searching Facebook for BR Super Widgets doesn't show up our Mr Bob. We don't have a lot of Facebook expertise, so we're asking for help here: does anyone know how to ensure you're found in Facebook search under both short and long company names? We found this similar question in the Facebook forum, but the poor old questioner never got a response.

    Read the article

  • IE9 Loses Some CSS After Particular Form Submit

    - by Asherion
    The site I am editing has a search form. For the record, there are several other forms on the site (contact and the like); this is the only one with an issue. Upon submission of the form, SOME of the styling is lost in IE9 (possibly other versions of IE; I haven't tested that yet). Primarily, the margins and colors set on html and body appear to be lost, while menus, banner, text, etc. all retain their styles. All styles are on one sheet. Any helpful advice? Here are the contents of the search page, the PHP used to check for the form, and the CSS that I think is lost.

    The HTML:

        <div id="search">
        <br />
        <div style="float:right;font-size:.8em;">
        <form name="form_sidesearch" action="search.html" method="post">
        <input type="hidden" name="action" value="search" />
        <input type="text" name="search_value" value="<?php echo $systems_primary->search_value ?>" />
        <input type="submit" name="submit_search" value="Search Website" />
        </form>
        <br />
        </div>
        </div>
        <?php echo stripslashes($search_results);

    The PHP:

        <?php
        // -- Begin Search --------------------------------------------------
        if($_REQUEST["action"] === "search") {
            if(strlen($_REQUEST["pg"]) <= 0) {
                $_REQUEST["pg"] = 1;
            }
            $search_results = $systems_primary->search_website("index",urldecode($_REQUEST["search_value"]),"<div class=\"listing ui-corner-all\"><a href=\"{ENTRY_URL}\" title=\"{ENTRY_TITLE}\" class=\"listing_title\">{ENTRY_TITLE}</a>{ENTRY_CONTENT} <a href=\"{ENTRY_URL}\" title=\"{ENTRY_TITLE}\" style=\"font-size:.8em;\">...read more</a></div><br /><br />",345,"all",10,$_REQUEST["pg"]);
        }
        // -- End Search ------------------------------------------------------
        ?>

    The lost CSS (could be more):

        html {
            background-color:#F6E6C8;
            font-size:16px;
            font:Helvetica;
        }
        body {
            width:1027px;
            margin:0 auto;
            background-color:#ffffff;
            font: arial, times new roman, sans-serif;
        }
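    A common first check with IE9-only style loss (an assumption here, not a confirmed diagnosis) is that the post-submit response is dropping into Quirks or Compatibility View, which discards some standards-mode styling; forcing standards mode early in the head rules that out:

        <!-- as close to the top of <head> as possible, before scripts -->
        <meta http-equiv="X-UA-Compatible" content="IE=edge" />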

    Read the article

  • How can I screen clients that try to register multiple times?

    - by Aba Dov
    My company offers a bonus to every client who registers, and we would like to prevent people from abusing this by registering several times. We have thought about filtering clients by IP (problem: workplaces where all stations share the same IP) or by cookies (if cookies are not allowed, we might lose a client). I would like your opinions on these two methods, and will be glad to hear about new ones. Thanks.
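    If it helps, here is a rough sketch of combining both weak signals server-side and flagging rather than blocking, so same-IP offices aren't rejected outright. The table, column, and helper names are made up for illustration, and $pdo is an assumed PDO connection:

        <?php
        $ip     = $_SERVER['REMOTE_ADDR'];
        $cookie = isset($_COOKIE['reg_id']) ? $_COOKIE['reg_id'] : null;

        $stmt = $pdo->prepare(
            'SELECT COUNT(*) FROM registrations WHERE ip = ? OR cookie_id = ?'
        );
        $stmt->execute(array($ip, $cookie));

        if ($stmt->fetchColumn() > 0) {
            // Don't auto-reject: queue the signup for manual review instead.
            flag_for_review($ip, $cookie); // assumed helper
        }
        ?>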

    Read the article

  • Retroactively applying a Piwik goal to visitors

    - by Andrew Aylett
    I started receiving a large (for me) amount of traffic on one of my pages yesterday. Today I decided it would be useful to track goals from that page; there's a link to my blog on it. I added the 'visited external link' goal to Piwik, and new visits are being recorded. However, it seems to me that there must be enough data in the database to retroactively apply the goal to past visitors. Is there a way to achieve that?

    Read the article

  • Is this form of cloaking likely to be penalised?

    - by Flo
    I'm looking to create a website which is considerably JavaScript-heavy, built with Backbone.js, with most content passed as JSON and loaded via Backbone. I need some advice or opinions on the likelihood of my website being penalised for serving plain HTML (text, images, everything) to search engine bots and a JS front-end version to normal users. This is my basic plan: the first request to any page returns HTML giving about 1/4 of the page, and thereafter the remaining 3/4 loads via Backbone.js, so non-JavaScript users get a 'bit' of the experience. Once a new user has visited and been detected to have JS, a cookie is saved on their machine, and requests from then on are AJAX-only. Example: If (AJAX || HasJSCookie) { // Pass JSON }. Search-engine-served content: that entire AJAX experience is stripped when, for example, a Google bot is detected; the same content is served, but all as HTML. I thought about just allowing search engines to index the first 1/4 of the content, but as I'm concerned about internal links and picking up every bit of content, I thought it would be better to give search engines everything. I plan to do this by checking against a list of user agents to know whether it's a bot or not: If (Bot) { // serve plain html }. In addition, I plan to use clean URLs for the entire website despite full AJAX; serving AJAX content at www.example.com/#/page and normal HTML at www.example.com/page is out of the question. I'd rather avoid the practice of using # when technologies such as HTML5 pushState are around. So my question is really just asking the opinion of the masses: is it likely that my website will be penalised? And can you suggest an alternative that avoids the 'noscript' method?
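    A sketch of the user-agent test described above, with the caveats that UA sniffing is fragile and that serving bots different markup than users is exactly what cloaking penalties target; the pattern list is illustrative, not complete:

        <?php
        // Crude bot check: matches a few major crawler user-agent substrings.
        function is_bot($userAgent) {
            return (bool) preg_match('/googlebot|bingbot|slurp|baiduspider/i', $userAgent);
        }

        if (is_bot($_SERVER['HTTP_USER_AGENT'])) {
            // serve the full plain-HTML rendering
        } else {
            // serve the Backbone.js shell; subsequent requests return JSON
        }
        ?>

    The safer contemporary alternatives were Google's AJAX crawling scheme (_escaped_fragment_) or, with pushState URLs, serving everyone the same HTML and letting JS enhance it.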

    Read the article

  • Stop bots from crawling old links with extensions

    - by Jared
    I've recently switched to MVC3, which uses extensionless URLs, but Google and Bing have a wealth of links they are still crawling which no longer exist. So I'm trying to find out if there is a way to format robots.txt (or some other method) to tell Google/Bing that any link ending in an extension isn't a valid link. Is this possible? On pages that I'm concerned a user may have saved as a favourite, I display a 404 page that lists the links to take once they are redirected to the new page (I decided not to just redirect them, as I don't want to maintain those redirects forever). For Google/Bing's sake I do have the canonical tag in the header.

        User-agent: *
        Allow: /
        Disallow: /*.*

    EDIT: I just added the third line above, and it APPEARS to do what I want: allow a path, but disallow a file. Can anyone confirm this?
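    An alternative to blocking the old URLs is to 301 them onto the new extensionless routes, which preserves any link equity instead of discarding it. A sketch using the IIS URL Rewrite module in web.config (this assumes the path structure carried over unchanged, which may not hold for every route):

        <system.webServer>
          <rewrite>
            <rules>
              <rule name="LegacyAspx" stopProcessing="true">
                <match url="^(.+)\.aspx$" />
                <action type="Redirect" url="{R:1}" redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>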

    Read the article

  • What liability concerns do advertising vendors raise, and how can I address them?

    - by Beofett
    One of the websites I administer wants to provide free advertising in the form of direct links to vendors at an event they are running. Up until now, there has been no advertising whatsoever on the site (or any of our other sites). The site is for a for-profit business. The idea of implicit endorsement of any vendors we advertise has been raised, which brought up the question of what we need to do, if anything, to protect ourselves from any potential problems such endorsement might create. I know that many sites have clauses in their Terms of Service that state that (in a nutshell) they are not responsible for any problems or grievances between the visitors to the site and any vendor advertised or linked. Are there other steps that a website typically takes when considering advertising, such as getting the advertiser to provide some sort of certification that their ad will not violate any trademarks or copyrighted material?

    Read the article

  • Tracking unique views for a site showing my advertisements

    - by user580950
    I am in trouble. I placed an advertisement on a website in 2012; the website said they got 950,000 unique visits each month. The advertisement didn't work out, and when I checked 2-3 months later, I saw that the unique visitors on the site numbered around 8,000 at that time. I immediately closed the account. I don't remember which site I used to check the unique visitors, and the advertising company has now filed a dispute against me. So, is there any tool that can show me 2012 statistics for an arbitrary website? I tried Google Trends, but it doesn't show traffic statistics.

    Read the article

  • Web Hosting Backup/Disaster Recovery Plan - Which Company?

    - by Harry Muscle
    I've been asked to look after consolidating all of our various company websites onto one host, and also to provide a disaster recovery plan in case the chosen host goes down, goes out of business, etc. We're most likely going to go with HostGator as our host; however, I'm not sure who to pick as our backup host. HostGator uses cPanel and can produce regular full backups (i.e. including configuration) of all the sites we host. Ideally I'm looking for a solution where we can provide these backups to another company and, within a short period of time, they restore all the sites onto their servers and we're back up and running. The whole disaster recovery process has to be fairly straightforward in terms of what needs to be done in case I am unavailable to assist and no one else overly technical is available (i.e. "take these backup files, send them to this company, and ask them to do this"). Any suggestions on which company would be a good choice for this backup solution would be highly appreciated. Thanks, Harry

    Read the article

  • How do I redirect www and non-www, but not the IP

    - by Chad T Parson
    I am trying to redirect www.domain.com or domain.com to www.domain.com/temp.html. I am using the following code:

        RewriteCond %{HTTP_HOST} ^.*$
        RewriteRule ^/?$ "http\:\/\/www\.domain\.com\/temp\.html" [R=301,L]

    That works; however, I do not want to redirect the IP. If someone types in the server's static IP, I do not want them redirected to www.domain.com/temp.html. Does anyone have the code to take care of this?
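    A sketch of one way to do it: match the host condition against the domain names only, so a request that arrives under the bare IP never satisfies the RewriteCond and falls through untouched (the over-escaped target URL in the original also isn't needed):

        RewriteEngine On
        # Only name-based requests (www or bare domain) are redirected;
        # direct-IP requests don't match the condition and are left alone.
        RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
        RewriteRule ^/?$ http://www.domain.com/temp.html [R=301,L]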

    Read the article
