Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.

Page 142/389

  • Can a registrar outage affect my website?

    - by Harry Muscle
    If my registrar has an outage and their servers go down for a while, will that affect my website? I'm not using any of my registrar's hosting packages or name servers; I simply have a few domain names registered with them that point to my actual hosting provider. Based on my current understanding of how these things work, I believe that an outage at my registrar will not affect my websites at all, but I'd like to hear this from someone with more experience in this field.
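
    As an aside on why this usually holds: the domain's delegation lives in the TLD registry's nameservers and points at the hosting provider's DNS, so the registrar's own web servers are not in the resolution path. A quick way to check (a sketch, assuming a Unix-like shell with dig installed and example.com standing in for the real domain):

        # Which nameservers the registry delegates the domain to
        dig NS example.com +short

        # Trace resolution from the root servers, bypassing local caches
        dig +trace example.com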

    Read the article

  • Getting a domain sub-directory URL for a new server

    - by Xianlin
    I have a web application server running Tomcat, and I need to publish my APIs to internet users. However, I don't have a domain name for this server; I can only give the server's IP address (e.g. 145.XXX.XXX.XXX) to point out where my API XML files are located. I have another web server with a domain name, http://www.webserver.com, registered on the internet, and I want to make use of its domain name to serve the location of my web application server's API XML files. How can I do that? Using www.webserver.com/api or using api.webserver.com? Which is better? I also wonder, if I want to publish an rtsp://145.XXX.XXX.XXX link for video streaming purposes, can I use rtsp://www.webserver.com/api to replace it, and how would I do that? I always thought a URL containing a sub-directory name cannot point to another IP address; it can only point to another folder on the web server itself.
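
    For the HTTP part, one common pattern (a sketch only, assuming the www.webserver.com machine runs nginx; 145.XXX.XXX.XXX and port 8080 are placeholders for the Tomcat server's real address) is to reverse-proxy the /api path to the application server, while the api.webserver.com variant would simply be a DNS A record pointing at the application server's IP. An rtsp:// link is a different protocol and cannot be forwarded by an HTTP path rule.

        # nginx on www.webserver.com: forward /api/... to the Tomcat application server
        location /api/ {
            proxy_pass http://145.XXX.XXX.XXX:8080/;   # placeholder address and port
            proxy_set_header Host $host;               # pass the original Host header along
        }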

    Read the article

  • Slashdotted web site seeks new home

    - by Arthur Edelstein
    I am maintaining a website that contains mostly simple HTML (just a little PHP). Normally the site receives only 4,000 hits per month, but it was recently slashdotted by the New York Times (30,000 visitors and 30 GB in a day) and the web host provider (Bluehost) throttled the CPU in response. This slowed down the website considerably. What web host providers would offer a more scalable solution? Ideally I would like a high-quality host that charges by the GB and can let bandwidth expand during sudden slashdotting episodes without a reduction in performance.

    Read the article

  • How to disallow indexing but allow crawling?

    - by John Doe
    On the front page of my website I have previews of articles (each with a small introduction) that link to the full articles. I want to disallow the front page to prevent duplicate content. But if I do this (in robots.txt), would the linked pages still be crawled? I mean, would the full articles still be reached by the crawler even though I disallowed the only page that links to them? I don't want to stop the crawler from accessing the page and following the links in it; I just don't want it to index the page's content (which is repeated in the full articles).
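
    The split the question is after is usually handled with a robots meta tag rather than robots.txt: a Disallow rule blocks crawling entirely, while a noindex, follow tag lets the page be crawled and its links followed but keeps the page itself out of the index. A minimal sketch of the tag for the front page's head element:

        <!-- Front page only: keep this page out of the index,
             but let crawlers follow the links to the full articles -->
        <meta name="robots" content="noindex, follow">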

    Read the article

  • Are you aware of any client-side malware that sends lots of junk requests for .gifs?

    - by Matt Sherman
    I am getting dozens of 404 errors on my site that are requests for GIFs with apparently random names, like 4273uaqa.gif and 5pwowlag.gif. I see that most of them are coming from one user. I assume something is happening in the background on her machine without her knowledge -- a malware thing on the client. Have you seen this behavior before, and do you know what sort of malware might cause it? I would love to advise my customer that s/he has an issue. I'd also like to stop getting these 404 reports. (Reposted from main Stack Overflow.)

    Read the article

  • How to hold a payment in PayGate for a while?

    - by Fero
    Hi all, I have a query about holding a payment in the PayGate payment gateway. Here is the problem in brief: I am building a website where payment should only be collected once a certain number of members have bought the product. For example, if there is an iPhone on my site, that phone must be bought in a certain quantity set by the admin. The quantity may be reached one user at a time, or a single user may buy the whole quantity at once. In this case I need to hold the payments, because I don't want to actually receive the money until the required quantity has been bought; if it is not reached, I would have to refund the money to the buyers' accounts, and we would rather avoid that process. That's why we are looking at holding the payment. Is this possible, or what is the best way to solve this problem? Please let me know your professional opinion. Thanks in advance...

    Read the article

  • Projects to learn web development

    - by David McDavidson
    I'm trying to get a job as a web developer, but the great majority of job offers require previous experience and a portfolio to prove you've got the required skills. Unfortunately I don't have any real experience or anything to show. The best way to learn is to try to tackle real-world problems, so I'd like to know: what would be some good projects for learning that will also look good in a portfolio?

    Read the article

  • Will traffic exchange programs work with pay-per-impression ad campaigns?

    - by vinoth
    I have a technology blog with nearly 3,000 visits a day (1,000-1,300 unique visitors). I generate visitors through traffic exchange programs like hitleap.com and addmefast.com. Now I'm planning to join some pay-per-impression programs. So far I have learned that Google AdSense won't allow traffic from exchange programs. Do all pay-per-impression programs follow the same policy? Will I get banned for using traffic exchange programs? Please reply with your suggestions. Thanks in advance.

    Read the article

  • AdWords: Is there a drawback to setting a really high CPC to learn what works faster?

    - by Rob Sobers
    I'm toying with setting my max CPC really high on all my keywords to ensure my ad gets shown in the top spot on page one in order to draw more clicks. I think this will be a good way to quickly figure out whether the ads I'm writing have a decent CTR and, more importantly, whether the landing pages I'm building are converting. Since I can set a max daily budget for my campaign, I won't risk breaking the bank. I can't think of any drawbacks, personally. Am I missing any?

    Read the article

  • Why does 301 redirect work for http but not for https?

    - by Tom G
    Through my domain registrar I have set up a domain, essayme.co.uk, to automatically forward to https://google.com. If I go to http://essayme.co.uk it works as expected and redirects me to https://google.com.

        $ curl -i http://essayme.co.uk
        HTTP/1.1 301 Moved Permanently
        Cache-Control: max-age=900
        Content-Type: text/html
        Location: https://google.com
        Server: Microsoft-IIS/7.5
        X-AspNet-Version: 4.0.30319
        X-Powered-By: ASP.NET
        Date: Sat, 07 Jun 2014 11:14:16 GMT
        Content-Length: 0
        Age: 0
        Connection: keep-alive

    However, if I go to https://essayme.co.uk it just freezes and times out.

        $ curl -i https://essayme.co.uk
        curl: (7) Failed connect to essayme.co.uk:443; Operation timed out

    What is happening in the second case? (And, if possible, how can I get the redirect to work for HTTPS?) Problem background/clarification: I don't have an SSL certificate for the essayme.co.uk domain above, but I do for my live domain (let's call it mywebsite.com), and I was seeing the exact same problem on that domain (hence why I'm trying to debug it). Unfortunately I can't experiment with the live domain (as it's live), and I would like to avoid having to buy a second certificate for essayme.co.uk just for debugging (unless absolutely necessary). The problem I was seeing: my live domain, mywebsite.com (not its real name), has a valid SSL certificate. Visiting https://www.mywebsite.com displayed the web page as expected. I had set up forwarding (as in the question above) from the naked domain (mywebsite.com) to https://www.mywebsite.com. Visiting http://mywebsite.com redirected to https://www.mywebsite.com as expected. However, visiting https://mywebsite.com would freeze and time out (as in the question above). I also tried forwarding it to http://www.otherwebsite.com as an experiment (i.e. forwarding to another site that does not use SSL), but the result was the same: visiting http://mywebsite.com redirected to http://www.otherwebsite.com as expected, while visiting https://mywebsite.com would freeze and time out again. So I set up essayme.co.uk as an experiment to try to understand why it doesn't work.
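
    The pattern described is consistent with the forwarding service only listening on port 80: a redirect can only be delivered over HTTPS by a server that accepts the TLS connection on port 443 with a certificate valid for that hostname, so without one the connection simply hangs or is refused. As a hedged illustration of what the HTTPS side would need (an nginx sketch with placeholder certificate paths, not the registrar's actual IIS configuration):

        # A TLS listener is required before any redirect can be served over https://
        server {
            listen 443 ssl;
            server_name essayme.co.uk;

            # A certificate valid for essayme.co.uk must exist; these paths are placeholders
            ssl_certificate     /etc/ssl/certs/essayme.co.uk.crt;
            ssl_certificate_key /etc/ssl/private/essayme.co.uk.key;

            # Only after the TLS handshake succeeds can the 301 be delivered
            return 301 https://google.com;
        }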

    Read the article

  • What sort of phone numbers are allowed as the WHOIS contact?

    - by billpg
    I'm getting a non-trivial number of scam phone calls to the contact phone number listed in WHOIS. Could I change it to a premium-rate line? If the scammers want to talk to me so much, make them pay for the privilege! Seriously though, are there any restrictions on the type of phone number I can give as my WHOIS contact, notwithstanding that it must be a phone number which can be used to contact the domain holder? In the UK, cell phones are more expensive for the caller to reach than landlines, so I suspect a significant number of people are already listing a "premium-rate" phone number.

    Read the article

  • What is the correct step-by-step logic for exporting a scene with baked occlusion and loading it at runtime?

    - by myWallJSON
    I wonder what the correct step-by-step logic is for exporting a scene with baked occlusion (culling data) so that the scene can be loaded at runtime (on the fly from the internet, for example). Currently my plan looks like this:

    1. I create prefabs.
    2. I place them onto my scene (into the Hierarchy), say 20 buffaloes, some horses and some buildings.
    3. I create an empty prefab and drag all my scene objects from the Hierarchy onto it.
    4. I export that prefab.

    So generally I put all my scene objects into one large prefab and export it, but it seems that all objects that were marked as static get this property turned off when loaded at runtime, so no frustum culling and no occlusion culling happens. So I wonder: what is the correct way to export Scene + Objects + Occlusion (and other culling) data for a future load of such a scene at runtime? I'm asking about the current 3.5.2 Pro and the upcoming 4 Pro versions of Unity3D.

    Read the article

  • Multilingual website without language component in the URL

    - by user359650
    I'm working on a website for Canada which will have French and English versions. For SEO purposes, I would like to avoid using any language tag in URLs, because I believe it will have more impact (e.g. example.ca/products is better than en.example.ca/products or example.ca/en/products). I believe this is technically possible because the two languages are sufficiently different that the URLs won't conflict with one another (e.g. if you want a "product" page, it will be /products in English and /produits in French, so you know which language the URL is about). Since Google (and most likely others) doesn't rely on the URL (nor on HTML tags) to determine the content language, I don't see any problems with search engines. To make this possible I've thought about using a cookie distinct from the session cookie (e.g. example.org_language) with a long-term expiry (e.g. N years) that will remember the language chosen by the user. That way, when people visit the website with a new browser session, they get served the proper language. I have already given up on users being able to switch a single page from English to French: when people choose English or French from the menu they will be redirected to the corresponding version of the home page. Do you foresee any problems with not using a language component in the URL (whether domain or path), as long as one makes sure the URLs don't conflict?
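
    One note for the SEO side (a sketch only, reusing the example.ca URLs from the question): the two language versions of a page can be tied together explicitly with hreflang alternate links, which tell search engines which language each URL serves without any language code in the domain or path.

        <!-- On example.ca/products (the English page); the French page carries the same pair -->
        <link rel="alternate" hreflang="en-CA" href="http://example.ca/products">
        <link rel="alternate" hreflang="fr-CA" href="http://example.ca/produits">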

    Read the article

  • Analytics - Where do my drop offs go?

    - by BadCash
    I have a website set up with Google Analytics (through the WordPress plugin "Google Analytics for WordPress" by Joost de Valk). When I check the Visitors Flow in Google Analytics, it shows something like this:

        (home)   - 43% drop-offs
        /page-2/ - 10% drop-offs
        ... etc ...

    I have also set up events for external links. The main "goal" of the website is to drive traffic to my Android app on Google Play, so I have a couple of different links to it that are all set up as events. Everything seems to be working; my events show up when I go to Content > Events in Google Analytics. However, it seems to me that some percentage of the users reported as "drop-offs" have in fact clicked on one of the external links, but there's no information about the reason for those drop-offs in the Visitors Flow chart. I can of course check each specific event category and event action and set "other" to Content/Page, which (I guess) shows the number of visitors who triggered a specific event on a specific page. It just seems like such a complicated way of going about this! So, is there a way to get a more detailed picture, including events, in the Visitors Flow chart? Something like:

        (home) - 43% drop-offs
            Event Action: "Google Play" = 50%, "YouTube" = 10%, (not set) = 40%

    Read the article

  • WUXGA revisited

    - by John Paul Cook
    I previously blogged about my search for a 17” 1920x1200 laptop. The only one I could find was a 17” MacBook Pro, which has been an excellent machine for running Windows and SQL Server. It is no longer made. Apple has a few refurbished ones available; just be sure to get a matte display if you buy one. If you want WUXGA resolution or better in a laptop, your only off-the-shelf option is now the 15” MacBook Pro with the Retina display, which is 2880x1800. This exceeds the resolution of my 30” 2560x1600...(read more)

    Read the article

  • What is the best way to promote a programming blog?

    - by paul
    (The guys from 'Programmers' referred me here...) How do you promote your programming blog? I recently started http://blackforestcoder.blogspot.com/ to record my progress working with new technologies and ideas. The main aim is to provide a list of pitfalls and solutions and also to get feedback from readers. Since I set it up 10 days ago I have had only about 2-3 hits, even though Google is supposed to be indexing it. How might I boost the hit rate?

    Read the article

  • Cropping images & SEO

    - by user1181950
    So I have a page with a bunch of images of widely varying sizes. The layout of the page is such that the images are all in the shape of square tiles, so just resizing will produce distorted images. What I've been doing so far is that when users upload images, I resize and crop them appropriately, display the new image as the thumbnail, and load the full image when the user clicks on it. However, I just realized this is an issue for SEO, as Google will crawl the thumbnails and put the thumbnails on Google Images instead of the full images. Is there any way to show a cropped/resized image but have Google Images show the full image? I could do something with CSS using an enclosing div and overflow:hidden, but I'd imagine the performance of that would be pretty bad. Any suggestions? Thanks! PS. I saw this (Make google index the actual image not the thumbnail), but in my case I have users continuously uploading images, and the database of images is always changing and pretty big (thousands), so a sitemap would be pretty unwieldy.
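
    For reference, the CSS route mentioned in the question can be sketched like this (class names are made up for illustration): the img src points at the full-size file, which is what image crawlers pick up, while the visitor sees only a square crop. The trade-off is that every tile downloads the full image.

        /* Square tile that clips anything outside its bounds */
        .tile {
            width: 200px;
            height: 200px;
            overflow: hidden;
        }

        /* The full image stays in the src attribute; object-fit crops it
           to fill the tile without distortion */
        .tile img {
            width: 100%;
            height: 100%;
            object-fit: cover;
        }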

    Read the article

  • Can many different log-in pages result in SEO duplicate-content and/or low-quality penalties?

    - by Noam
    I have internal pages that rely on an external API, and I would like to build each one only upon user request. Two options I have thought about:

    1. Make lots of 'thin' pages that state that if you want content about X, you need to log in, and then the page will be built. Pros: the user understands what he'll get when logging in. Cons: the SEO implications of such a solution, due to the mass of 'low quality' and 'cross-site duplicate content' pages.

    2. Make them all redirect to ONE generic log-in page. Pros: no duplicate low-quality content. Cons: lots of internal links to the same log-in page.

    Which would you recommend?

    Read the article

  • Website with sections in Drupal?

    - by Matt Hampel
    What is the best way to create a website with sections in Drupal? Users need to be able to add, remove, and nest pages fairly easily. Pages added to a section should have an appropriate URL, like "/[section name]/[page title]". This seems like a straightforward task, but I can't find the right combination of tools to do it. Subsite comes close, but for some odd reason, doesn't set up the correct content paths. The closest I got was creating a book for each subsection, but that feels like I'm using the wrong tool for the job. Edited with my solution: I used organic groups with pathauto. I set pathauto so that pages in groups had URLs that were of the form [group path]/[page title].

    Read the article

  • Why is a Joomla-based website that was copied off the live server into localhost not showing pictures and throwing 404 errors?

    - by Darius
    I have copied a Joomla-based website via FTP onto my machine and I am trying to make it run on my localhost, which is provided by the latest version of XAMPP. I have exported and imported the DB with no problems. I have placed all the files and folders into the htdocs folder, but when I go to localhost/examplesite all I get is the text that is on the front page, with no pictures, and it displays a 404 error. Do I need to make changes to .htaccess? If so, can someone point me in the right direction? Thanks
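
    Two settings commonly need adjusting after moving a Joomla site into a subfolder on localhost, and they produce exactly these symptoms when left stale (a hedged sketch; the folder name mirrors the question, and the .htaccess part only matters if SEF URLs / mod_rewrite are enabled on the site):

        # .htaccess in the site root (only if Joomla's SEF URLs use mod_rewrite):
        RewriteBase /examplesite/

        # configuration.php: either leave $live_site empty or point it at the new location,
        # otherwise links to images and pages may keep referencing the old live URL
        public $live_site = 'http://localhost/examplesite';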

    Read the article

  • Redirect pages to fix crawl errors

    - by sarah
    Google is giving me crawl errors for pages that I have removed, like www.mysite.com/mypage.html. I want to redirect these pages to the new location, www.mysite.com/mysite/mypage. I tried to do that using .htaccess, but instead of fixing the problem the crawl errors increased, and a new one appeared: www.mysite.com/www.mysite.com. This is my .htaccess file:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /sitename/
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /sitename/index.php [L]
        </IfModule>
        # END WordPress

    Should I add this after the rewrite rule, or should I do something else?

        RewriteRule ^pagename\.html$ http://www.sitename.com/pagename [R=301]
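
    A hedged note on placement (file and path names below mirror the question's placeholders): a redirect like this generally needs to sit inside the mod_rewrite block, before the WordPress catch-all conditions, and carry the L flag so processing stops at the redirect. The target should also be a full absolute URL; a substitution written without the http:// scheme is treated as a path, which is a common cause of doubled-hostname URLs like www.mysite.com/www.mysite.com. A sketch:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /sitename/

        # Old removed page -> new location; absolute URL plus R=301,L stops further rewriting
        RewriteRule ^mypage\.html$ http://www.mysite.com/mysite/mypage [R=301,L]

        # ... the existing WordPress rules follow here ...
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /sitename/index.php [L]
        </IfModule>
        # END WordPress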

    Read the article

  • Fake links cause crawl error in Google Webmaster Tools

    - by Itai
    Google reported crawl errors last week on my largest site through Webmaster Tools. Here is the message: "Google detected a significant increase in the number of URLs that return a 404 (Page Not Found) error. Investigating these errors and fixing them where appropriate ensures that Google can successfully crawl your site's pages." The Crawl Errors list is now full of hundreds of fake links like these, causing 16,519 errors so far. Note that my site does not even have a search.html and is not related to any of the terms shown in the above image. Inspecting the sources for one of those links, I can see this is not simply an isolated source but a concerted effort: each of the links has a few to a dozen sources, all from different, seemingly unrelated sites. It is completely baffling why someone would spend effort doing this. What are they hoping to achieve? Is this an attack? Most importantly: does this have a negative effect on my site? Could it negatively impact my ranking? If so, what should I do about it? The few linking pages I looked at are full of thousands of links to tons of sites, have no contact information, and do not seem to be run by the kind of people who would simply stop if asked nicely! According to Google Webmaster Tools, these errors have appeared in a span of 11 days. No crawl errors were being reported previously.

    Read the article

  • Magento checkout with Paypal

    - by jplozanojuan
    I've set up PayPal under Configuration > Sales > PayPal a thousand times, but I'm still not getting the PayPal option listed in the payment methods during the checkout process. And yes, I have filled out all the required info, like the API credentials and such. Also, I'm not getting any errors from Magento at all. This is driving me crazy, and it seems that no one (as far as I can see) has gone through this situation. Thanks in advance.

    Read the article

  • SEO blog Indexing: wordpress.com subdomain vs a registered domain?

    - by rumspringa00
    I've used WordPress for a few of my clients' sites, mostly small businesses and eCommerce sites. I have found through Google Analytics, as well as the All in One Webmaster plugin, that when it comes to social media, using WordPress is a surefire way of getting your site indexed by Google and, occasionally, Bing and Yahoo. Since I am a heavy WP user, I'd like to contribute by registering a .wordpress domain for my portfolio. When using a WP installation together with a WP domain, e.g. myportfolio.wordpress.com, will the site be more or less likely to be indexed than with a generic myportfolio.com domain? I've seen mixed opinions: some people seem to favor a WP domain for URL output, while others say that it's a moot point and that Google will not favor a WP domain over a .com domain as long as your meta tags are updated and your content is keyword-optimized. I tend to disagree and believe a WP domain would be more likely to be indexed and would output more URLs than an individual, laconic domain like myportfolio.com. Am I wrong?

    Read the article

  • Indexing and Page Ranking Issues

    - by user631249
    Hi all, I am on the first page of Google for keywords related to MOVING; however, I can't seem to break into the carpet cleaning rankings. I have made changes and additions which haven't been indexed yet. Should I wait for the next crawl, or can someone please give me pointers on the carpet cleaning indexing? Also, I have 53 pages submitted and only 38 indexed; where could the problem be? Is there software to check for indexing hiccups? Thanks.

    Read the article
