Search Results

Search found 14789 results on 592 pages for 'pro backup'.

Page 285/592

  • Determining cause of random latency/loading issues

    - by Sherwin Flight
    I'm not sure exactly what details to post regarding my issue, because I'm not sure what is relevant. Prior to the end of September my websites all loaded quickly, in almost all cases; loading time wasn't usually more than a few seconds. Since the end of September, however, I have noticed a big increase in page loading times. In some cases pages were taking 30 seconds or more to load.

    I have a remote monitoring service watching some of the sites as well, and the image below shows the response times over the past month. The response times at the beginning of this graph are the usual times prior to this issue occurring, and you can see there has been a significant increase from the beginning to the end of the graph. The thing is, the problem is not happening 100% of the time. If I click through the site, or even just keep refreshing the page, about 25% of the time the pages load quickly; the remaining 75% of the time they load slowly. Sometimes the pages take so long to load that they time out and don't load at all.

    I have contacted my hosting provider, and they said things at their end were fine. I don't believe the problem is my home internet provider, because all other websites load without a problem. The server is located in Texas, USA. This also raises another interesting point: my remote monitor checks my site from two locations, California, USA and London, England. As you can see in the chart below, the response time is actually shorter when checked from London, which doesn't seem to make sense, since the server is physically closer to the California monitoring location. I would have expected the London monitoring location to have higher response times, since it is physically farther away.

    I should also point out that in some traceroute tests I've done, the first connection to the server seems to take the longest; after that, the rest of the page loads quickly. Below is a little chart showing the times for the first connection to the server. Sending the request to the server is very quick, and receiving the reply back seems pretty quick, but the WAIT time is really long: it connects, sends the request, but then waits close to 30 seconds before it starts receiving data back.

    So, what could be causing this problem, and what steps can I take to resolve it, or at least narrow it down? I am aware there are things I can do to speed up page loading times, like reducing the number of CSS/JS files used on a page, compressing images, etc., but that is not really the source of the problem, because nothing has really changed on the site since before the problem started, and other sites on the same server are loading slowly as well. Any help or advice is much appreciated.
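
    A quick way to narrow this down is to time the individual phases of a request yourself. Below is a minimal PHP sketch using cURL's timing info; the URL is a placeholder, not the actual site. If starttransfer_time (time to first byte, the "wait") dominates while connect_time stays small, the bottleneck is almost certainly on the server side, such as slow scripts, a slow database, or an overloaded host, rather than in the network path.

        <?php
        // Rough diagnostic sketch: break one request into DNS / connect /
        // wait / transfer phases to see where the 30 seconds is going.
        $url = 'http://www.example.com/'; // placeholder URL

        $ch = curl_init($url);
        curl_setopt_array($ch, array(
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_FOLLOWLOCATION => true,
            CURLOPT_TIMEOUT        => 60,
        ));
        curl_exec($ch);
        $info = curl_getinfo($ch);
        curl_close($ch);

        printf("DNS lookup:  %.3f s\n", $info['namelookup_time']);
        printf("TCP connect: %.3f s\n", $info['connect_time']);
        printf("First byte:  %.3f s\n", $info['starttransfer_time']); // the "wait" time
        printf("Total:       %.3f s\n", $info['total_time']);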

    Read the article

  • How To Track "Similar Product/Page" Links Within My Site

    - by Petra Barus
    So I just created a new widget that shows up on a product page on my site. This widget shows several products similar to the product displayed on the current page; the purpose is to help users compare similar products. Let's say on product page A, http://domain/products/A, the Similar Products widget shows http://domain/products/B, http://domain/products/C, http://domain/products/D and http://domain/products/E. My question is how to track something like "the Product B page was visited X times from the Product A page via the Similar Products widget". (There is also a chance that Product B will show up in the widget on the Product C page.) I have the idea of using the Event feature from Google Analytics, but I'm still not sure whether that is the common best practice, or what is.
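
    The Event tracking idea from the question is a common approach. A lower-tech alternative is to tag the widget links with query parameters so the referring product shows up in the normal Analytics content reports; a rough PHP sketch, with made-up function and parameter names, is below. If tagged URLs are used, they should carry a canonical tag pointing at the clean product URL so they are not indexed as duplicates.

        <?php
        // Hypothetical helper: build a similar-product link that records where
        // the click came from, e.g. /products/B?from=A&via=similar-widget
        function similarProductUrl($targetSlug, $currentSlug)
        {
            return '/products/' . rawurlencode($targetSlug)
                 . '?from=' . rawurlencode($currentSlug)
                 . '&via=similar-widget';
        }

        // Usage inside the widget on product page A:
        foreach (array('B', 'C', 'D', 'E') as $slug) {
            echo '<a href="' . htmlspecialchars(similarProductUrl($slug, 'A')) . '">'
               . 'Product ' . htmlspecialchars($slug) . '</a>';
        }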

    Read the article

  • Homepage not showing on Google

    - by MIke Mayberry
    About six weeks ago my homepage (mayberrykayakingdotcodotuk) disappeared from Google's organic results for "kayaking pembrokeshire", despite having been number 2 within a few weeks of its launch last summer. My previous site (www.mikemayberrykayakingdotcodotuk) had been 2nd for about six years and has 301 redirects for all pages to the new site. Google Toolbar still rates the homepage as 3/10, and the domain is still showing in search results, just not the homepage.

    A little research suggests that this is most likely due to Google treating two pages as identical content (one with www. and one without) since the changes in their algorithms around that time, and that the way to fix it is to add some code somewhere. This makes sense to me, as my print advertising doesn't have the www part of the address. I have cPanel access but limited knowledge of web coding, having picked things up as I've gone along and paid for designers etc. when needed. Would someone be able to let me know where I have to go to add the code, and what code I need to add, to redirect the crawlers to one page? Or is there another issue that is causing this? Thanks in advance.
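
    The usual fix for a www/non-www split is a sitewide 301 redirect so that only one hostname is ever served, most often done with a rewrite rule in .htaccess; the same idea can be sketched in PHP at the top of the site's entry script, as below (the hostname is a stand-in, not the real domain). Setting the preferred domain in Google Webmaster Tools can also help, without touching code.

        <?php
        // Minimal sketch: force every request onto the www hostname with a
        // permanent (301) redirect, so only one version of each page exists.
        $canonicalHost = 'www.example.co.uk'; // stand-in for the real domain

        if (strcasecmp($_SERVER['HTTP_HOST'], $canonicalHost) !== 0) {
            header('HTTP/1.1 301 Moved Permanently');
            header('Location: http://' . $canonicalHost . $_SERVER['REQUEST_URI']);
            exit;
        }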

    Read the article

  • Need a really simple client management script to deliver graphics and revisions, please help?

    - by Mark R
    I am looking for a very simple client management script. The process flow of the script should be:

    1. The client orders (PayPal etc.) while giving specs on what they need.
    2. They are given login details and thanked for their order.
    3. The backend for them consists of two-way communication: they ask questions, we answer. We also upload the graphics here, where they either accept them or ask for a revision.
    4. Process complete.

    Now I cannot for the life of me find something as simple as this. It seems all the scripts out there are way too complicated. Does anyone know of one I can use to do this?

    Read the article

  • Google Analytics goal funnel does not recognize virtual page views

    - by Webber Smith
    I have a setup wizard with 3 steps. Since I'm using AJAX, each step fires a virtual pageview with an appropriate URL (see below). The pageviews are being recorded in the Content section of Google Analytics, but the Goal Funnel still shows zero for each step. I've tried advice from other forums, such as:

    - Make sure the Goal URL is set to Exact match.
    - Make sure no steps or the Goal URL are a parent directory of any other steps. For example, don't track /wizard/ as a Goal/step and also track /wizard/step2/. (Not sure why this would be a problem since it is an exact match, but it shouldn't hurt, so I tried it.)
    - Require (or don't require; I tried both) the first step in the funnel.

    None of these seem to work. Thoughts?

    Goal settings: Exact match, "/wizard/setup-complete/"
    Funnel: Step 1 "/wizard/step1/", Step 2 "/wizard/step2/", Step 3 "/wizard/step3/"

    Read the article

  • What preparation is needed before applying for ads?

    - by cj333
    I have built a new site (finished last week, but no SEO, no clients yet). I know that applying for ads can take a long time, so I sent my request to Google AdSense and then started on SEO and submitted my site to the search engines. But yesterday I received an e-mail saying, roughly: "Thank you for your interest in Google AdSense. Unfortunately, after consideration of your request, it is not possible to accept your application to AdSense." What preparation should be done before applying for ads? Did I skip something that must be done before applying? And how do I apply for Google DoubleClick, or other ad services? I am new to applying for ads; can anyone recommend some good ad services? Thanks.

    Read the article

  • subdomain not working and added /mysubdomains/devsitename

    - by krish
    I have a site, www.example.com, which is working fine, and I have a number of sub-domains which are also working fine, except one. When I go to the URL subdomain.example.com, the address bar shows the following: subdomain.example.com --> www.subdomain.example.com/mysubdomains/devsitename. It added the www and the /mysubdomains/devsitename part, which is the directory the site is hosted in on my server. It then came up with "the website you were looking for is unavailable". Has anyone experienced this issue? Do you know how to resolve this?

    Read the article

  • Working out costs to implement a WCAG 2.0 (AA) site

    - by Sixfoot Studio
    Hi, I've run our client's site through a WCAG 2.0 validator, which has returned 415 tasks that need to be worked through in order to make it WCAG 2.0 compliant. For the most part I can get a rough estimate of how long a task will take, but there are tasks I have never had to do before which I am not sure how to cost. I would like to know if someone has a rough guide on what to charge a client to convert their site to a compliant WCAG 2.0 (AA) site. Many thanks.

    Read the article

  • Using cracked software and tools [closed]

    - by Lena Aslo
    I am seeing people complaining about expensive tools such as Dreamweaver or Photoshop. I am just wondering about that, because everyone knows that they can get this software running for free (if it is done illegally). Why don't they just use a cracked version? Is it so likely to get caught? I feel that nowadays a lot of people are using cracked software but whenever the topic is mentioned, they ALL say PSSST!!! or start criticizing it, even though they are doing it themselves...

    Read the article

  • Magento Default Sitemap.xml

    - by chipShot
    Is the default Magento sitemap.xml optimized as-is for ecommerce products? I'm thinking about adding image links as well. Is it worth investing time in this for SEO gains? A typical entry currently looks like:

        <url>
          <loc>http://demo.com/product.html</loc>
          <lastmod>2011-08-03</lastmod>
          <changefreq>always</changefreq>
          <priority>1.0</priority>
        </url>
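
    Google's sitemap protocol does support image information through a separate image namespace, so one option is to emit an <image:image> block inside each <url> entry when the sitemap is generated. A hedged PHP sketch follows; the URLs are placeholders and the helper is not part of Magento, and the namespace must be declared on the <urlset> element.

        <?php
        // Sketch: build one <url> entry with image links for an image sitemap.
        // Requires xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
        // to be declared on the <urlset> root element.
        function sitemapUrlEntry($pageUrl, array $imageUrls)
        {
            $xml = "  <url>\n    <loc>" . htmlspecialchars($pageUrl) . "</loc>\n";
            foreach ($imageUrls as $img) {
                $xml .= "    <image:image>\n"
                      . "      <image:loc>" . htmlspecialchars($img) . "</image:loc>\n"
                      . "    </image:image>\n";
            }
            return $xml . "  </url>\n";
        }

        echo sitemapUrlEntry('http://demo.com/product.html',
                             array('http://demo.com/media/product-front.jpg'));

    Whether it is worth the time depends on how much image search traffic the products can realistically attract; for image-heavy catalogues it is usually considered a cheap win.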

    Read the article

  • auth_mysql and PHP [migrated]

    - by user1052448
    I have a directory in a virtualhost file that is password protected with auth_mysql, using a MySQL user/pass combination. The problem is that one file inside that directory needs to be accessed without a user/pass. Is there a way I can pass the user/pass within a PHP file? Or exclude the one file? What would I put in the configuration below?

        <Location /password-protected>
            ...mysql password protection
            require valid-user
        </Location>
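
    Excluding a single file is normally handled in the Apache configuration itself, but to answer the literal question, credentials can also be supplied from a PHP script placed outside the protected directory that fetches the protected file and passes it through. A rough sketch, with placeholder path and credentials:

        <?php
        // Sketch: proxy script living OUTSIDE /password-protected that fetches
        // the one protected file with HTTP Basic Auth and echoes it back.
        $protectedUrl = 'http://www.example.com/password-protected/file.php'; // placeholder
        $user = 'someuser';   // placeholder credentials
        $pass = 'secret';

        $ch = curl_init($protectedUrl);
        curl_setopt_array($ch, array(
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_HTTPAUTH       => CURLAUTH_BASIC,
            CURLOPT_USERPWD        => $user . ':' . $pass,
        ));
        $body = curl_exec($ch);
        curl_close($ch);

        echo $body !== false ? $body : 'Could not reach the protected file.';

    The cleaner route, if the server configuration allows it, is to carve the single file out of the protected Location with a Files section that relaxes the auth requirement, which avoids storing credentials in PHP at all.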

    Read the article

  • Different domains for different things, or just one?

    - by Mahdi
    Suppose I'm starting my business, and my main area is computer services: graphics, programming, computer repair, networking and so on. Now the question is, what do you recommend for better ranking? Should I have a separate domain for each of these fields, or can I have them all as different pages/categories on one website? My preferred CMS is WordPress. Also, do you recommend using keywords in the domain name even if it becomes hard to remember, meaningless and long? Thanks.

    Read the article

  • Should I pass link juice to my pages on other websites that are already high PR domains?

    - by huzzah
    I am starting a new website for a local business and have entries listed for it on places like urbanspoon, yelp, google+ local, etc. I am thinking of listing these citation sites on my business website to encourage visitors of my site to go and review the business on those sites. If I dofollow I will pass link juice to my page on that site, but doesn't that mean that the very very little PR juice I have will be leached away from me? Is it better to nofollow them?

    Read the article

  • SEO and external sites that serve responsive images (like Re-SRC)

    - by Baumr
    Re-SRC is a tool that allows you to automatically serve responsive images for your website from their cloud servers. It delivers a new image file each time the browser window (viewport) is resized. To use it in your HTML when linking to an image, you would do the following: <img src="http://app.resrc.it//www.your-domain.com/img/img001.jpg"/>

    Some more background for SEO considerations: as an example, looking at their demo page's code, the src of the Arc de Triomphe photo, when the browser window is resized to tablet width, shows this particular file at its widest. It is found under the following URL: http://app4-uk.resrc.it/s=w560,pd1/ro=h//www.resrc.it/img/demo/demo-image-1.jpg If the viewport is increased to desktop width, then a smaller image is served in line with the design; see this URL: http://app4-uk.resrc.it/s=w320,pd1/ro=h//www.resrc.it/img/demo/demo-image-1.jpg If I change the viewport to about halfway between those two, then the image's URL is: http://app4-uk.resrc.it/s=w240,pd1/ro=h//www.resrc.it/img/demo/demo-image-1.jpg In other words, I found that there is a separate file for every 10-pixel increment of the image width. Very cool for saving bandwidth on mobile devices and serving responsive/retina images on others, but...

    Here are two problems I see for SEO:

    1. The img on your site, part of your semantic markup, will not be hosted on your site at all, or even on a server you control. Any links to these images will pass on "link juice" to Re-SRC's site instead.
    2. You are serving a vast array of different image files to different people; some may link to one, others to another size. Then there's the question of what different search engine crawlers will see.

    Also: there seems to be no fallback option if their servers are down. Do you see any other concerns? Or, perhaps, do you not see those as concerns?

    Read the article

  • Why do my websites have a first page rank on Bing and Yahoo but not Google? [closed]

    - by Linda Cullum
    I have 3 websites suffering from a drop in ranking with Google, and hence a huge drop in traffic. The instant drop occurred in September and I have not been able to remedy it. For the past 6-10 years my main website http://LearnToSail.Net has ranked from #3 to #1 on the 1st page of Google and all the other engines for the search term "learn to sail". Now it shows on the 1st page of Bing and Yahoo but does not show up on ANY pages of Google. The only way it does come up is if I add "cd" to the "learn to sail" phrase; we sell a sailing CD on that website.

    The other websites are http://LearnToSailOnLine.com (search terms "learn to sail online" or "learntosailonline") and historyofthepilgrims.com (search terms "history of the pilgrims" or "historyofthepilgrims"). I get the same result: gone on Google, but 1st pages on Bing and Yahoo.

    I have researched, edited, updated blogs, made sitemaps, prayed to the universe and used Google Webmaster Tools, but nothing is changing and I have lost a lot of business. I host with 1and1.com and have been back and forth with them, but to no avail and no change in traffic. I thought maybe some DNS mapping was off. I used to have a lot of traffic; now I have hardly any. Any advice would be greatly appreciated. I am still in the process of working on the issue, of course! This is a really great website here and I am glad I came across it. Thank you, LS Cullum, Little Pines Multimedia

    Read the article

  • Adding your website to free web directories as a link building strategy

    - by Man
    It's been two months since I launched a website. I recently ran into some websites which list directories of other websites; some examples are http://www.addgoogleurl.com/ and webdirectorieslist.com, etc. I was talking with my colleague and he says that adding the URL of my website to these kinds of websites will negate the effect of other organic, real links. Does Google assign positive or negative weight to these kinds of links from web directory websites? Do you have any source for your answer to refer to? I found a question asked before on webmasters.SE, but it is about many links from a single website.

    Read the article

  • Google Analytics: How can I track traffic and referrals from iPad applications?

    - by kayaker243
    In Google Analytics there is extensive information on the mobile device, version and browser version. However, this doesn't seem to go beyond the mobile browser. I would like to determine which application is responsible for visits to my site; specifically, I want to know how many visits are coming from Zite. http://www.handsetdetection.com/properties/vendormodel/Apple/iPad/page:4 seems to indicate this information is probably available. Where (or does) Google Analytics expose this?

    Read the article

  • Disable outbound links without letting others know

    - by tadoman
    Is there a way I can tell Google not to follow external links (pointing to other sites) without letting others know? I know you can disable outbound links by putting rel=nofollow on them or something in robots.txt, but that's something others can see as well. I'm just wondering if there's a way to tell Google not to follow those links without letting others know, like a setting in Webmaster Tools or something similar. (There's definitely one way: I could set an exception in my server's conf file to check for the user agent "googlebot" and serve a different version of robots.txt, so that when a different user checked that link it would return a different robots.txt than the one served to Googlebot. However, I'm not too sure Google would be too happy about this.) Thank you.
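
    For reference, the user-agent check described in the question can be sketched in PHP; the version below conditionally adds rel="nofollow" to outbound links only when the visitor identifies as Googlebot (the helper name is made up). Serving crawlers different markup than users is cloaking, which, as the question already suspects, Google may penalize; the safe option is simply to add rel="nofollow" for everyone, since ordinary visitors rarely read the markup anyway.

        <?php
        // Sketch of the user-agent-based idea from the question; note that
        // showing crawlers different markup than users is cloaking and risky.
        function outboundLink($url, $text)
        {
            $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
            $rel = (stripos($ua, 'Googlebot') !== false) ? ' rel="nofollow"' : '';

            return '<a href="' . htmlspecialchars($url) . '"' . $rel . '>'
                 . htmlspecialchars($text) . '</a>';
        }

        echo outboundLink('http://other-site.example/', 'Partner site');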

    Read the article

  • Blogger homepage won't update!

    - by Sims Siniron
    I am new to blogging and Webmaster Tools. When I add a new post to my blog, it normally updates my homepage in Google's index as well. But since the 14th of January, my homepage hasn't been updated in the Google SERPs, and as a result I am losing my ranking. Previously, when I posted a new article, 70-80% of my posts would reach the first page of results; since the problem started, none of them reach even the top 15 pages of Google.

    On 1/12/12, Google sent me a "Notice of DMCA removal from Google Search" message indicating that one of my URLs contained some infringing content, which I deleted after receiving their notice. Not only that, I also checked all of my posts for any additional infringing content. After removing it, I filled out Google's Content Removed Notification form to notify them, and Google sent me feedback that they had received it and suggested, "In the future, if you have removed the allegedly infringing content from your site (and won't put it back), please use the correct form", which I also filled out.

    Now my question is: is everything I did so far all right? Although my new posts are indexed in the Google SERPs with "..", why won't Google's crawler update my homepage, which previously updated automatically whenever a new article was published?

    Read the article

  • Recommend an open source CMS for single page web site

    - by RedMan
    Hi, I want to create a single-page web site like http://kiskolabs.com/ or http://www.carat.se to display my portfolio. I want to add new products after launching the site without having to edit the entire site. I've looked at OpenCart (too much for a single-page site), Magento (more for ecommerce) and WordPress (couldn't find open source / free templates which I can start from). Can you suggest a CMS which will support the creation of a single-page site and allow insertion of new products without having to edit the entire page? I would prefer a CMS which also has open source / free templates which I can tweak for my use. I can do PHP, MySQL and XML. If it is an easier option I can do PSD to site (but I don't know much about this at all).

    Read the article

  • Random links, SEO and spam

    - by DoesNotCompute
    I built a mini-forum with social features for a client. To promote user registration, I planned to add a box on the forum pages displaying pictures of, and profile links to, random registered users. I managed to make this random selection static for a day; I mean the list is renewed each day and does not change on every page refresh. Could this random list of links be harmful to SEO by being considered some kind of spam?
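
    As an aside, the "static for a day" behaviour described above is usually achieved by seeding the random number generator with the current date, so every request on a given day picks the same members; a minimal PHP sketch, where $registeredUsers stands in for whatever the forum actually stores:

        <?php
        // Sketch: pick the same "random" profiles all day by seeding with the date.
        $registeredUsers = array('alice', 'bob', 'carol', 'dave', 'erin', 'frank'); // placeholder data

        mt_srand(crc32(date('Y-m-d')));  // same seed for the whole day

        $picks = array();
        $wanted = min(4, count($registeredUsers));
        while (count($picks) < $wanted) {
            $i = mt_rand(0, count($registeredUsers) - 1);
            $picks[$i] = $registeredUsers[$i];  // keyed by index so no duplicates
        }

        foreach ($picks as $name) {
            echo $name, "\n";  // in the real widget: link to the member profile
        }

    Because the seed only changes when the date does, crawlers and visitors see the same box all day, which also keeps the page cache-friendly.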

    Read the article

  • Open source login solution

    - by David
    Authentication is such a general problem that most websites have to implement it. There are a few commercial solutions, but all lack sufficient functionality to customize the registration process, so I am looking for an open-source alternative. I am using PHP with PostgreSQL as the database, but as far as I understand, one could use authentication solutions built on other technologies and integrate them into our site in various ways. Therefore, I am looking for such solutions in any technology, apart from those requiring Microsoft infrastructure. I would prefer an open source solution which has already implemented the following features:

    - A password recovery procedure
    - The username is the email address of the user
    - "Remember me" functionality (meaning that the user is logged in automatically without seeing the login page)
    - Email address verification

    Google has gotten me nowhere on this, and neither has a search on this site...
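
    As a yardstick when comparing candidates, the "Remember me" requirement usually boils down to a long-lived cookie carrying a random token whose hash is stored server-side. A rough PHP/PostgreSQL sketch, assuming a hypothetical remember_tokens table, is shown below; any solution worth shortlisting should do something equivalent, alongside the recovery and verification features listed above.

        <?php
        // Sketch of issuing a persistent-login ("Remember me") token.
        // Table and column names are illustrative, not from any particular package.
        function issueRememberToken(PDO $db, $userId)
        {
            $token = bin2hex(openssl_random_pseudo_bytes(32)); // secret sent to the browser
            $hash  = hash('sha256', $token);                   // only the hash is stored

            $stmt = $db->prepare(
                "INSERT INTO remember_tokens (user_id, token_hash, expires_at)
                 VALUES (:uid, :hash, NOW() + INTERVAL '30 days')"
            );
            $stmt->execute(array(':uid' => $userId, ':hash' => $hash));

            // Secure and HttpOnly so the token is not readable from JavaScript.
            setcookie('remember_me', $token, time() + 30 * 24 * 3600, '/', '', true, true);
        }

    On a later visit the cookie value is hashed again and looked up; if it matches an unexpired row, the user is logged in without seeing the login page.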

    Read the article

  • Add copyright notice to a website

    - by PeeHaa
    Not really a programming question, but I find it related. If not (or if you find this question too subjective), please tell me, yell at me, swear at me, kick me in the nuts, or just vote to close :)

    I've read some questions and answers here on SO about adding copyright notices, but not the specific ones I am looking for. I want to add a copyright notice to a website I created, something like "(c) Me 2010. All rights reserved." I am aware that everything written by someone is automatically copyrighted (if I'm not mistaken, and perhaps depending on country laws). I see some sites use the format "(c) Me 2009-2010", but to me it makes no sense to add an 'end date' to the notice. I am aware I can write code to update the notice every year, but I just find it strange. Or is it just me?

    Another question: I also use copyrighted code from others (they are all mentioned in the credits, incl. links to their licenses ofc) on my site. Would it still be OK to add the copyright notice to the site with only "Me" in it?

    So to sum it up, I have 2 questions:

    1. What is THE RIGHT WAY™ of adding a copyright notice on a website (or code or whatever)? If there is one.
    2. Is it allowed to copyright code with other copyrighted code within it?
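
    Since the question mentions writing code to keep the year current, the usual trick is simply to print the current year in the notice so it never goes stale; a one-line PHP sketch (the name is a placeholder):

        <p>&copy; 2009&ndash;<?php echo date('Y'); ?> Me. All rights reserved.</p>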

    Read the article

  • Legal responsibility of public posts

    - by Murdock
    Given a public site with no logins: I let people post links to public Facebook profiles, and my site fetches the profile picture and displays it. Would it be OK if I just told people to post only profiles for which they have the owner's permission? Does such a statement exonerate me from copyright infringement and place the burden on the user?

    Edit, for bonus points: can the statement just be a notice under the button (that will save the link) saying "By clicking this button you agree to the terms and conditions", with maybe a link to the terms and conditions?

    Read the article

  • Add rules (filters) to ftp programs to avoid uploading certain files/folders

    - by guisasso
    I use FileZilla as my FTP client, but this question applies to any other client that could be useful. Can I (in any client) add rules (filters) to an FTP program to avoid uploading certain files or folders? For example, Expression Web creates those annoying _vti_cnf folders, and there are certain folders in which I have the original version of a picture, without a watermark, that I don't want to upload. For example, I have a folder A that has sub-folders "original" and "current"; I would like to add a filter so that every time I select A to be uploaded, "original" wouldn't go, but "current" would.

    Read the article
