Search Results

Search found 17124 results on 685 pages for 'final cut pro'.

Page 172/685 | < Previous Page | 168 169 170 171 172 173 174 175 176 177 178 179  | Next Page >

  • Looking for a CDN

    - by Bill
    Most of the CDNs I've seen require you to upload your content in advance. I'm looking for a CDN that, upon receiving a request for a resource it hasn't seen, will contact my application server. If the application server returns something, the CDN should send it to the user and then cache it. If not, it should just return a 404. If the user requests an unexpired item, the CDN should serve it without bothering my app server. Does anything like this exist? Is there a way to get CloudFront to work like this?

    Read the article

  • Dotted subdomain name or new domain?

    - by Catalin Ilinca
    I have a company website hosted at www.BRAND.com (where BRAND is a generic name). The company wants to develop a "micro website" for one of its campaigns, named "Inspired By BRAND". I see two directions: inspired.by.BRAND.com, which I personally don't like much; I can't recall any web address of the form subdomain.subdomain.domain.com. And inspired.BRAND.com, which I think is best suited for it: fewer dots, and similar to the "more friendly" subdomain.domain.com pattern. Any hints, guidelines, or thoughts are well appreciated. Thanks in advance.

    Read the article

  • Why does Google return a soft 404 when I redirect to the sign-in page?

    - by Hettomei
    For the past month, Google Webmaster Tools has reported an increasing number of "soft 404" errors, although the pages work fine for users. I have made some fixes but can't figure out how to solve it. Configuration (maybe useless): the website is built with Rails 3.1, and authentication is handled by the gem Devise. Problem: on the page http://en.bemyboat.com/yacht-charter/9965-sailboat-beneteau-oceanis-43, when you click "Ask a Boat request" (a simple form that GETs http://en.bemyboat.com/boat_requests/new/9965), you are redirected with HTTP status 302 to sign in, and then sent back to the "new" page after successfully signing in. Google tells me that the "Ask a Boat request" link returns a soft 404. I can't make this form a POST (which would solve the problem) because we need to automatically redirect the user to the right page after sign-in (the gem Devise memorizes only the GET link). To simplify, the question is: how do I protect a private page with authentication, reached via a simple GET, without being penalized by Google with a soft 404? Thank you. PS: this website's English translation still needs work; please bear with it.
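
    For illustration, one way to keep the automatic redirect while switching the form to POST is to store the return location yourself before sending the user to sign-in. A minimal sketch: the prepare action, its route helper, and the session key are hypothetical, while after_sign_in_path_for and stored_location_for are standard Devise hooks:

        # Hypothetical action the POST form submits to, instead of GETting /boat_requests/new/:id
        class BoatRequestsController < ApplicationController
          def prepare
            # Remember where the user wanted to go, then ask them to sign in
            session[:return_to] = new_boat_request_path(params[:id])
            redirect_to new_user_session_path
          end
        end

        class ApplicationController < ActionController::Base
          protected

          # Devise calls this after a successful sign-in
          def after_sign_in_path_for(resource)
            session.delete(:return_to) || stored_location_for(resource) || root_path
          end
        end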

    Read the article

  • Fixing Google Chrome text antialias for .ttf fonts

    - by 71GA
    I found a topic that presents a solution for getting anti-aliasing to work in Google Chrome on Windows, but it uses the .svg format. My fonts are in .ttf format, and at the moment I import all of them like this:

        @font-face { font-family: "t1"; src: url(../fonts/title/circle.ttf); }
        @font-face { font-family: "t2"; src: url(../fonts/title/sanserifing.ttf); }
        @font-face { font-family: "t3"; src: url(../fonts/title/serveroff.ttf); }
        @font-face { font-family: "t4"; src: url(../fonts/title/pupcat.ttf); }

    How can I get anti-aliasing done right in Google Chrome on Windows?
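
    For illustration, the usual workaround from that era was to declare several formats per face so Chrome could pick the SVG (or WOFF) variant instead of the TrueType one. A sketch for one family, assuming circle.ttf can be converted to the other formats (the #circle fragment must match the font ID inside the SVG file):

        @font-face {
          font-family: "t1";
          src: url(../fonts/title/circle.svg#circle) format("svg"),  /* Chrome renders this with smoother anti-aliasing on Windows */
               url(../fonts/title/circle.woff) format("woff"),
               url(../fonts/title/circle.ttf) format("truetype");
        }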

    Read the article

  • Subdomain still times out a month after setup

    - by user8137
    I'm a newbie at this, and it has probably been asked already, but the answers I found online were too vague, so I've probably really messed this up. I would really appreciate specific step-by-step instructions. This is what I'd like to do: let external customers with specific permissions access www.high-res.domain.com (like FTP). We use Network Solutions to house domain.com. We recently added a new IP address to point to www.high-res.domain.com, and I gave that IP address to the company that hosts our website. Pinging www.high-res.domain.com resolves to the correct IP address, but the requests still time out. It's been a few weeks now, and it still times out:

        C:\>ping XXX.XXX.X.XXX
        Pinging XXX.XXX.X.XXX with 32 bytes of data:
        Request timed out.
        Request timed out.
        Request timed out.
        Request timed out.
        Ping statistics for XXX.XXX.X.XXX:
            Packets: Sent = 4, Received = 0, Lost = 4 (100% loss)

    Tracert times out as well. I checked with DNS tools and a few other sites, and they show the same thing. I recently went into the DNS management console on our server (Windows 2003 SP1) and created an A record under DomainDnsZones, which shows up as a CNAME when you look at it. Under the domain there are two entries, one for the subdomain and the other for the website host, each with separate IP addresses. Is this correct? The website people are too busy on another project to research it further, and my friends haven't gotten back to me. Please help. Thanks, KK
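
    For reference, the public record has to live in the zone that Network Solutions serves; entries created in an internal Windows DNS console are only visible inside the company network. In zone-file notation, the public entries might look like this (the IP is a placeholder for the web host's address):

        ; in the public zone for domain.com, at Network Solutions
        high-res      IN  A      203.0.113.10          ; placeholder IP of the web host
        www.high-res  IN  CNAME  high-res.domain.com.  ; only if the www-prefixed name is really needed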

    Read the article

  • Finding a Payment Gateway?

    - by Lynda
    I have a client who would like to sell glass pipes online. The problem I run into is the payment gateway: glass pipes fall into one of two categories, drug paraphernalia or tobacco products. This leads me here to ask: does anyone know of a payment gateway that will process payments for glass pipes? Note: from some Google searching I read that Authorize.net accepts tobacco, but when I spoke with them they said they do not.

    Read the article

  • How does Google Analytics aggregate the Count of Visits (Frequency & Recency Report)?

    - by Brian Dant
    Here's my simple understanding of Count of Visits: each person that comes to my site gets one "count" for each visit. They are put into a bucket of people with the same number of total counts: if you visit twice, you are in the two bucket; if you visit six times, you are in the six bucket. From there, a report (Frequency & Recency) makes a line for each bucket, reaches into the bucket, and totals the number of people in it, putting that total in the second column. My question: will a two-month report automatically put someone into two buckets, and put them on two separate lines in the Count of Visits table? This explanation makes it seem like a two-month report will put the same person into a bucket twice, one bucket for each month, and then show that person's visits on two different lines instead of aggregating them. Example for clarification: Bob comes to my site three times in January and seven times in February. I run a report for Jan 1 to Feb 28. Will Bob be on both the three-count line and the seven-count line, or will he be on the ten-count line?

    Read the article

  • Uploading an unfinished website

    - by Daniel
    I have a pretty basic question. I developed a neat little website which I'm ready to upload, but it still needs a bit of work. The designer needs the HTML to do his work, so the website needs to be uploaded. Besides that, I have to correct a couple of details, set up the friendly URLs, etc. What's the best way to put the site on its final hosting with its final domain while blocking it to unknown users, without hurting SEO and the like? If I were to just upload it, the unfinished website might be crawled by a search-engine bot. Thanks!
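
    One common approach is HTTP Basic authentication, which locks out crawlers (they just get a 401) and casual visitors alike, and has no SEO side effects once removed. A minimal sketch, assuming an Apache host; the paths and user name are placeholders:

        # .htaccess in the site root (create the password file first with:
        #   htpasswd -c /home/example/.htpasswd designer )
        AuthType Basic
        AuthName "Work in progress"
        AuthUserFile /home/example/.htpasswd
        Require valid-user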

    Read the article

  • You don't have permission to access /index.php on this server

    - by Tran Dinh Thoai
    I made a 'login with OpenID' page, and I get an error when the OpenID provider returns to my page: "You don't have permission to access /index.php on this server. Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request." If I remove the parameters that the OpenID provider appends, the page loads fine. How can I fix this problem? The login page that causes the error is: http://bryox.com/login

    Read the article

  • SEO blog indexing: wordpress.com subdomain vs. a registered domain?

    - by rumspringa00
    I've used WordPress for a few of my clients' sites, mostly small businesses and eCommerce sites. I have found through Google Analytics, as well as the All in One Webmaster plugin, that when it comes to social media, using WordPress is a surefire way of getting your site indexed by Google and occasionally Bing and Yahoo. Since I am a heavy WP user, I'd like to register a wordpress.com subdomain for my portfolio. When using a WP installation under a wordpress.com address, e.g. myportfolio.wordpress.com, will the site be more or less likely to be indexed than under a generic myportfolio.com domain? I've seen mixed opinions: some people seem to favor a wordpress.com address for URL output, while others say it's a moot point and that Google will not favor it over a .com domain as long as your meta tags are updated and your content is keyword-optimized. I tend to disagree; I believe a wordpress.com address would be more likely to be indexed and would output more URLs than a standalone domain like myportfolio.com. Am I wrong?

    Read the article

  • Google Analytics - drop in traffic

    - by user1001421
    Bit of a general question here. We are in the process of converting a number of our clients from older websites to new ones. The problem, and sorry for being so general here, is that we are seeing a sharp decline in traffic as reported by Google Analytics. It's not a gradual decline; it seems to hit almost as soon as the new site goes live. I've got a few questions to see if there is something we are doing wrong: a) We are using the same Analytics accounts going from the old site to the new. Is this a bad idea? b) The Analytics code is integrated into the pages using a server-side include. Is this a bad idea? c) We structure our sites differently from the old ones. The old sites had pretty much all the web pages in the root directory, with hyperlinks pointing at the page files, e.g. <a href="somepage.aspx">Link</a>. Our new sites have a directory structure that reflects the navigation structure, with hyperlinks pointing at the page's directory instead of the actual page, e.g. <a href="/new-items/shoes/">New shoes</a>. Is this a bad idea? I'm really searching for a needle in a haystack here and would appreciate any help or advice as to why we are seeing such a sharp and sudden drop in traffic. Again, sorry this is such a general question. Thanks in advance.

    Read the article

  • What are the correct settings for a subdomain in ZoneEdit?

    - by user99572_is_fine
    I want to create a subdomain for a site hosted by Jimdo (a DIY website builder). Jimdo does not allow subdomains, however, so I am trying to find a workaround where the subdomain is hosted elsewhere but everything else remains as it is; e.g., I use their email service and I want to keep it. The domain is not registered through Jimdo but with a registrar that lets me edit my zones, and it currently points to the Jimdo nameservers. I also have independent hosting with its own NS information; this is where I want to host my subdomain. My thinking was that I could use ZoneEdit as a "fork" that lets me keep using my Jimdo page as before while directing one subdomain to another host. Provided this is possible: how do I configure ZoneEdit CNAME or NS records so that visitors still reach my website and my mail still reaches my Jimdo account, while a subdomain points to another host?
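
    In zone-file terms, the split being described might look like this; every name and address below is a placeholder, and the Jimdo targets would come from their setup instructions:

        @    IN  A      203.0.113.20          ; Jimdo's web server, unchanged
        www  IN  CNAME  example.jimdo.com.    ; placeholder Jimdo target, unchanged
        @    IN  MX 10  mail.example.com.     ; existing Jimdo mail host, unchanged
        sub  IN  A      198.51.100.7          ; only the subdomain points at the other host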

    Read the article

  • Are shorter URLs better for SEO?

    - by articlestack
    Many people shorten their URLs, but as I understand it, this creates the overhead of an extra redirect, others cannot guess the target article from the URL, and it is less friendly to "inurl:..." type searches. Should I shorten the URLs on my sites? Is there any advantage to short URLs besides the fact that they take fewer characters in the anchor tags on a page (good for page load)?

    Read the article

  • Trigger an IP ban based on a request for a given file?

    - by Mike Atlas
    I run a website where "x.php" was known to have vulnerabilities. The vulnerability has been fixed and I don't have "x.php" on my site anymore. As is common with major public vulnerabilities, script kiddies are running tools that hit my site looking for "x.php" throughout the entire structure of the site, constantly, 24/7. This is wasted bandwidth, traffic, and load that I don't really need. Is there a way to trigger a time-based (or permanent) ban on an IP address that tries to access "x.php" anywhere on my site? Perhaps I need a custom 404 PHP page that captures the fact that the request was for "x.php" and then triggers the ban? How can I do that? Thanks! EDIT: I should add that as part of hardening my site, I've started using ZBBlock: "This php security script is designed to detect certain behaviors detrimental to websites, or known bad addresses attempting to access your site. It then will send the bad robot (usually) or hacker an authentic 403 FORBIDDEN page with a description of what the problem was. If the attacker persists, then they will be served up a permanently recurring 503 OVERLOAD message with a 24 hour timeout." But ZBBlock doesn't do quite exactly what I want to do; it does help with other spam/script/hack blocking.
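
    One way to get the time-based ban without writing a custom 404 page is fail2ban watching the web server's access log. A sketch, assuming Apache with default log locations; the jail and filter names are made up:

        # /etc/fail2ban/filter.d/apache-xphp.conf
        [Definition]
        failregex = ^<HOST> .* "(GET|POST) [^"]*x\.php
        ignoreregex =

        # /etc/fail2ban/jail.local
        [apache-xphp]
        enabled  = true
        filter   = apache-xphp
        logpath  = /var/log/apache2/access.log
        maxretry = 1          # ban on the first attempt
        bantime  = 86400      # 24-hour ban, enforced at the firewall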

    Read the article

  • AdSense Custom Search Ads - custom query

    - by Alex
    I'm trying to set up a Custom Search Ad, but I am not sure about the query. The site (https://developers.google.com/custom-search-ads/docs/implementation-guide) says that 'query' should be dynamic based on your page: "This variable targets the ads and therefore should always match what the user on your site has just performed a search for." Now, what I understand is: I have to program my page so that the query variable contains certain custom words. Am I right? If a user gets to my site by clicking on an AdSense ad, there is no way to "know" what the user searched for and fill in my query accordingly, right? Thanks for any help!
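
    For illustration, the query is normally whatever the visitor typed into your own site's search box, e.g. read back out of the results page's URL. A sketch along the lines of the implementation guide; the pubId, container ID, and the ?q= parameter name are placeholders:

        // on your search results page; assumes Google's ads.js
        // (www.google.com/adsense/search/ads.js) is loaded and a
        // <div id="adcontainer1"></div> exists on the page
        var match = /[?&]q=([^&]*)/.exec(window.location.search);
        var pageOptions = {
          pubId: 'partner-pub-0000000000000000',   // placeholder publisher ID
          query: match ? decodeURIComponent(match[1].replace(/\+/g, ' ')) : ''
        };
        var adblock1 = { container: 'adcontainer1' };
        _googCsa('ads', pageOptions, adblock1);

    If the visitor landed on a non-search page (e.g. from an ad), there is nothing sensible to pass, which matches the guide's assumption that these units live on search results pages.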

    Read the article

  • Oversizing images to produce better-looking pages?

    - by Joannes Vermorel
    In the past, improper image resizing used to be a big no-no of web design (not to mention improper compression formats). Hence, for years I stuck to the policy that images (PNG or JPG) are resized on the server to match, pixel for pixel, the resolution they will have on the rendered page. Recently, I hastily designed an HTML draft with oversized images, using inline CSS such as width:123px and height:123px to resize them. To my (slight) surprise, the page turned out to look much better that way. Indeed, with better screen resolutions, some people (like me) tend to browse with some level of zoom (125% or even 150%), otherwise fonts are just too small on screen. If the image is strictly sized, the enlarged image appears blurry (a pixel-interpolation effect), but if the image is oversized the result is much better. Obviously, oversizing images is not an acceptable pattern if your website is intended for mobile browsing, but are there cases where it would be considered acceptable, especially if the extra page weight is small anyway?

    Read the article

  • OpenSearchDescriptions good or bad signal in Google's eyes?

    - by JeremyB
    I noticed a site using this tag: <link rel="search" type="application/opensearchdescription+xml" title="XXXXXXXXX" href="http://www.XXXXXXXXXX.com/api/opensearch" /> As I understand it (based on http://www.opensearch.org/Home), this tag is a way of describing search results (so you use it on pages that contain search results) to make it easier for other search engines to understand and use your results. Given that Matt Cutts has said Google generally frowns on "search results within search results", is using this tag a bad idea on a page that you hope will rank well in Google?

    Read the article

  • In cPanel, mail goes to spam instead of the Gmail inbox

    - by Robin Jain
    I have a cPanel VPS server on which I created a domain, but when I send mail through webmail to a Gmail address, it lands in spam. Note: the mail IP is not blacklisted, SPF records are enabled, DKIM is enabled, and reverse DNS is correct. Email header information:

        Delivered-To: [email protected]
        Received: by 10.143.93.13 with SMTP id v13csp119806wfl;
                Fri, 6 Jul 2012 08:01:36 -0700 (PDT)
        Received: by 10.182.52.42 with SMTP id q10mr26133912obo.46.1341586895571;
                Fri, 06 Jul 2012 08:01:35 -0700 (PDT)
        Return-Path: <[email protected]>
        Received: from lakshyacs-u.securehostdns.com ([50.97.147.134])
                by mx.google.com with ESMTPS id fx3si18028369obc.144.2012.07.06.08.01.35
                (version=TLSv1/SSLv3 cipher=OTHER); Fri, 06 Jul 2012 08:01:35 -0700 (PDT)
        Received-SPF: pass (google.com: domain of [email protected] designates 50.97.147.134 as permitted sender) client-ip=50.97.147.134;
        Authentication-Results: mx.google.com;
                spf=pass (google.com: domain of [email protected] designates 50.97.147.134 as permitted sender) [email protected]
        Received: from localhost.localdomain ([127.0.0.1]:39016 helo=harishjoshico.com)
                by lakshyacs-u.securehostdns.com with esmtpa (Exim 4.77)
                (envelope-from <[email protected]>)
                id 1SnA2J-0006Nq-05 for [email protected]; Fri, 06 Jul 2012 20:31:35 +0530
        Received: from 223.189.14.213 ([223.189.14.213])
                (SquirrelMail authenticated user [email protected])
                by harishjoshico.com with HTTP; Fri, 6 Jul 2012 20:31:35 +0530
        Message-ID: <[email protected]>
        Date: Fri, 6 Jul 2012 20:31:35 +0530
        Subject: ggglkhl
        From: [email protected]
        To: [email protected]
        User-Agent: SquirrelMail/1.4.22
        MIME-Version: 1.0
        Content-Type: text/plain; charset=iso-8859-1
        Content-Transfer-Encoding: 8bit
        X-Priority: 3 (Normal)
        Importance: Normal
        X-AntiAbuse: This header was added to track abuse, please include it with any abuse report
        X-AntiAbuse: Primary Hostname - lakshyacs-u.securehostdns.com
        X-AntiAbuse: Original Domain - gmail.com
        X-AntiAbuse: Originator/Caller UID/GID - [47 12] / [47 12]
        X-AntiAbuse: Sender Address Domain - harishjoshico.com

    Read the article

  • Restricting crawler activity to certain directories with robots.txt

    - by neimad
    I would like to use robots.txt to prevent indexing of some parts of my website. I want search engines to index only the / directory and not search inside my controllers. In my robots.txt, I have this:

        User-Agent: *
        Disallow: /compagnies/
        Disallow: /floors/
        Disallow: /spaces/
        Disallow: /buildings/
        Disallow: /users/
        Disallow: /

    I put this file in /mysite/public. I tested the file with a robots.txt validator and got no errors. However, Google always returns the result of my site. For testing, I added Disallow: /, but again, Google indexed all pages. floors, spaces, buildings, etc. are not physical directories. Is this a bug? How can I work around it?
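
    Two details worth noting against that file: the final Disallow: / line on its own blocks the entire site, and robots.txt only stops crawling; URLs Google already knows about can remain indexed. A version that blocks only the controller paths would look like this:

        User-Agent: *
        Disallow: /compagnies/
        Disallow: /floors/
        Disallow: /spaces/
        Disallow: /buildings/
        Disallow: /users/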

    Read the article

  • Cropping images & SEO

    - by user1181950
    So I have a page with a bunch of images of widely varying sizes. The layout is such that the images all appear as square tiles, so just resizing would distort them. What I've done previously is resize and crop images appropriately when users upload them, display the new image as the thumbnail, and load the full image when the user clicks on it. However, I just realized this is an issue for SEO, as Google will crawl the thumbnails and put them on Google Images instead of the full images. Is there any way to show a cropped/resized image but have Google Images show the full image? I could do something with CSS using an enclosing div and overflow:hidden, but I'd imagine the performance of that would be pretty bad. Any suggestions? Thanks! PS: I saw this (Make google index the actual image not the thumbnail), but in my case users are continuously uploading images, and the database of images is always changing and pretty big (thousands), so a sitemap would be pretty unwieldy.
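
    For what it's worth, the CSS crop mentioned above is cheap for browsers to render; the real cost is bandwidth, since the full file is downloaded for every tile. A minimal sketch, with placeholder dimensions:

        .tile {
          width: 200px;
          height: 200px;
          overflow: hidden;   /* crop rather than distort */
        }
        .tile img {
          width: 100%;        /* scale to the tile; any excess height is cut off */
          height: auto;
        }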

    Read the article

  • How can I use domain masking without self-referrals in Google Analytics?

    - by Cdore
    I have one old domain that points to a website's server IP (let's call it www.oldsite.com). I have a new one, www.newsite.com, that is set up to be forwarded to a specific page on the website. Because the host of newsite.com places the website in a frame, www.newsite.com is listed in Google Analytics as a source, rather than the source visitors actually came from, causing self-referrals. A solution, as I looked up, is to edit the code of the iframe, but of course there's no way to edit the host's masking source code. Another solution I tried previously was to have www.newsite.com point to the address that www.oldsite.com pointed to. That solved the Analytics problem, but in exchange the URL masking no longer worked: the address bar showed www.oldsite.com. Is there a way to keep URL masking while having Google Analytics attribute referrals correctly? The website is hosted on a cloud server, if that helps.

    Read the article

  • What sort of phone numbers are allowed as the WHOIS contact?

    - by billpg
    I'm getting a non-trivial number of scam phone calls to the phone number listed as the WHOIS contact. Could I change it to a premium-rate line? If the scammers want to talk to me so much, make them pay for the privilege! Seriously though, are there any restrictions on the type of phone number I can give as my WHOIS contact, beyond its being a number at which the domain holder can be contacted? In the UK, cell phones are more expensive for the caller than landlines, so I suspect a significant number of people are already listing a "premium-rate" phone number.

    Read the article

  • Make Blogger load faster

    - by Wladimir Ivanov
    Hi all. I use Blogger as the platform for an electronic music blog. Because of the blog's subject matter, I embed many iframes (YouTube and SoundCloud), and of course this makes the articles load slowly. Almost every article on this blog consists of some text and many iframes below it. What should I do in this particular case to make the articles (pages) load faster? Is there a ready-made solution, or should I use something like a jQuery lazy load to load the iframes only once the user scrolls to them? Any help is greatly appreciated.
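
    The lazy-load idea works for iframes too: keep the real URL in a data-src attribute and copy it into src only when the frame nears the viewport. A minimal sketch without jQuery; the embed URL and the 200px preload margin are placeholders:

        <iframe data-src="http://www.youtube.com/embed/VIDEO_ID" width="560" height="315"></iframe>

        <script>
        // Copy data-src into src once an iframe is within 200px of the viewport
        function loadVisibleIframes() {
          var frames = document.getElementsByTagName('iframe');
          for (var i = 0; i < frames.length; i++) {
            var src = frames[i].getAttribute('data-src');
            if (src && frames[i].getBoundingClientRect().top < window.innerHeight + 200) {
              frames[i].src = src;
              frames[i].removeAttribute('data-src');
            }
          }
        }
        window.onscroll = loadVisibleIframes;
        window.onload = loadVisibleIframes;
        </script>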

    Read the article

  • GA and Unique visitors again

    - by DDEX
    I take care of a company intranet and measure its traffic with GA. I am absolutely sure there are no more than 5000 URLs on our intranet, and it is impossible to reach it from outside the company network. Yet when I check the number of unique visitors (UV) for the last year, GA says there were 36,500 of them... How is that possible? I thought UV was supposed to count each visitor only once in the given time period. Could anybody explain how this actually works? Can it be that the tracking cookies expire after some time, so the same visitors are counted more than once?

    Read the article

  • How can I tell GoogleBot that a subdirectory is now a subdomain?

    - by cwd
    I had about a million pages of a catalog indexed under a subdirectory, and that content has now moved to a subdomain. GoogleBot is crawling each page and getting a 301 redirect to the new location. Even though the redirect rule lives in the Apache sites-enabled configuration file, so the redirect happens early and PHP is never even loaded, the server isn't handling the load well. GoogleBot is making around 5 requests per second, and on top of my normal traffic that pushes the CPU up for hours at a time. I checked Webmaster Tools and the corresponding documentation for a way to let Google know that the content had moved from a subdirectory to a subdomain, with little luck; basically the most helpful advice was to just send 301 headers for the new location. How can I tell GoogleBot that a subdirectory is now a subdomain? If that is not an option, how can I serve 301 redirects for a particular subdomain more efficiently? I was thinking perhaps Nginx, but I'm not sure I can run Apache and Nginx side by side on port 80 for different subdomains.
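
    For comparison, a single mod_alias rule is about the cheapest answer Apache can give, since it needs no rewrite engine and no per-request filesystem checks. A sketch, with /catalog standing in for the real subdirectory:

        # in the old site's VirtualHost; mod_alias appends the rest of the path automatically,
        # so /catalog/item-123 becomes http://catalog.example.com/item-123
        Redirect permanent /catalog http://catalog.example.com

    If the existing rule already looks like this, the remaining lever is GoogleBot itself: Webmaster Tools offers a custom crawl-rate setting that can slow it down for the site.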

    Read the article
