Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.


  • .com vs .me for personal and blogging sites: which one is better for SEO?

    - by Sameer Manas
    I basically have a domain under my own name with a .com extension. I am planning to use it for my portfolio and also as a regular blog. Considering SEO and ranking, what is the best way to set this up? Option A: myname.com for the portfolio and myname.com/blog for the blog, or option B: myname.com for the blog and myname.me for the portfolio. I have absolutely no idea how TLDs affect SEO and ranking, so I am seeking expert advice on this. Thanks in advance.

    Read the article

  • Trade-off: lower the number of URLs in the sitemap from 43k to 23k, or update sitemap.xml only weekly?

    - by Tobias
    We rewrote the sitemap creation process, and the sitemap now contains 43,000 URLs, 20k more than before. The URLs change daily, but the script that builds the complete sitemap takes more than 30 hours, so we cannot run it every day. Assume that speeding up the script is not possible. What should I do? A: stay with the 23k URLs and update the sitemap daily, or B: increase to 43k URLs and update it weekly?
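    One way to sidestep the trade-off entirely (a sketch only; the file names and dates below are made up) is to split the 43k URLs across several smaller sitemap files listed in a sitemap index, so only the chunks whose URLs actually changed need to be regenerated and re-dated each day:

      <?xml version="1.0" encoding="UTF-8"?>
      <!-- sitemap_index.xml: lists the per-chunk sitemaps; refresh only the
           chunks that changed and bump their <lastmod> -->
      <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <sitemap>
          <loc>http://www.example.com/sitemap-stable.xml</loc>
          <lastmod>2013-01-01</lastmod>
        </sitemap>
        <sitemap>
          <loc>http://www.example.com/sitemap-daily.xml</loc>
          <lastmod>2013-01-15</lastmod>
        </sitemap>
      </sitemapindex>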

    Read the article

  • How to point a GoDaddy domain to an EntryDNS domain

    - by geminiCoder
    I have a server on a dynamic IP and use EntryDNS to track the changes: if I go to my EntryDNS URL, it points to my server's current IP. I also purchased a domain from GoDaddy, but I have been unable to get it to point at my EntryDNS name. What I ultimately want is to be able to ssh to my server, ideally using my own domain name. I must confess I'm a bit overwhelmed by the GoDaddy interface. So, bottom line: how do I point my GoDaddy domain at my EntryDNS domain so that when I look up my domain I get the server's current IP?
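    One common arrangement (a sketch only; the record name and the EntryDNS hostname below are placeholders) is to leave the bare domain alone and point a subdomain at the EntryDNS name with a CNAME in GoDaddy's DNS manager, since the EntryDNS name already follows the changing IP. A CNAME cannot sit on the bare domain itself, so the apex would need an A record or forwarding instead.

      ; hypothetical zone entry added in GoDaddy's DNS manager for mydomain.com
      ; "home" becomes home.mydomain.com and resolves via the EntryDNS hostname
      home    3600    IN    CNAME    myhost.entrydns.example.

    Once the record propagates, ssh user@home.mydomain.com should reach the server at whatever its current IP happens to be.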

    Read the article

  • Nginx routing script for NodeJS and Wordpress

    - by Nilay Parikh
    We are moving our blogs and site from WordPress to NodeJS and are ready to go to production. However, I can't figure out how to implement the routing on the front server (Nginx): requests should go to NodeJS (the preferred web instance), and if the content has not been synced into the NodeJS site yet (NodeJS will return a 404), the request should fall back, via reverse proxy, to WordPress so it can serve the page during the transition period. Q1: Is this a good approach for the scenario, or can anyone suggest a better one? Q2: Should NodeJS act as the reverse proxy itself in the fallback case (using bouncy, https://github.com/substack/bouncy, or a similar package), or should it just return the 404 and let Nginx handle the fallback using the FastCGI approach? Both NodeJS and WordPress sit on a single server. First scenario: User -> Nginx -> NodeJS (8080); if the resource is available NodeJS serves it directly, otherwise NodeJS reverse-queries WordPress and serves the content itself. Second scenario: User -> Nginx -> NodeJS (8080); if the resource is available NodeJS serves it directly, otherwise it returns 404 to Nginx and an Nginx rule falls back to WordPress (FastCGI PHP). Later we plan to phase WordPress and PHP out of the server environment completely. I'd like to see any example Nginx or Varnish configuration and/or NodeJS scripts you can share; a sketch of the Nginx fallback is included below. Thanks.
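    For the second scenario, a rough nginx sketch (the port, paths and socket name are assumptions, not taken from the question) would proxy to NodeJS first and let nginx intercept its 404s with a named location that hands the request to WordPress over FastCGI:

      # nginx server block -- sketch only; adjust paths/ports to your setup
      server {
          listen 80;
          server_name example.com;

          location / {
              proxy_pass http://127.0.0.1:8080;   # NodeJS instance
              proxy_set_header Host $host;
              proxy_intercept_errors on;          # act on upstream error codes
              error_page 404 = @wordpress;        # fall back without changing the URL
          }

          location @wordpress {
              root /var/www/wordpress;
              include fastcgi_params;
              fastcgi_param SCRIPT_FILENAME $document_root/index.php;
              fastcgi_pass unix:/var/run/php5-fpm.sock;   # PHP-FPM socket (assumption)
          }
      }

    Keeping the fallback in nginx also means the @wordpress block can simply be deleted once the migration is complete.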

    Read the article

  • I'm using a shared server, so Gmail marks my email as spam (many different From addresses come from the same IP)

    - by chipperyman573
    I am on a shared server, meaning many people share the same IP. When I send an email, my @website.com domain is different from that of someone else who shares the IP with me, and Gmail marks the mail as spam. For example: my website's IP is 1.2.3.4 and my website is mywebsite.com. Person 2's website is hosted by the same host, so its IP is also 1.2.3.4, and their website is person2.com. When they send an email, it is sent from an address at person2.com; when I send one, it is sent from an address at mywebsite.com. According to Gmail's bulk sender guidelines: "Use the same address in the 'From:' header on every bulk mail you send." Again, the only similarity between our websites is the IP, yet this causes Gmail to mark both our mail as spam. Is there a way to sort this out with Gmail?
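    Shared-IP reputation aside, authenticating the sending domain usually helps Gmail tell the two senders apart. A minimal sketch of an SPF TXT record for mywebsite.com (the mechanisms listed are assumptions; include whatever actually sends your mail), ideally combined with DKIM signing if the host supports it:

      ; zone-file style sketch -- adjust the mechanisms to the host's real mail setup
      mywebsite.com.    IN    TXT    "v=spf1 a mx ip4:1.2.3.4 ~all"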

    Read the article

  • Cost-effective way to provide static media content

    - by james
    I'd like to deliver around 50MB of static content, either as about 30 individual files of up to 10MB each or grouped into 3 compressed files, roughly 5k to 20k times a day. Ideally I'd like some very basic security around serving the data, to ensure a request comes from the expected source, but if dropping the security brings a big price reduction then that's an option. Does anyone have suggestions beyond what I've found? Google App Engine is $0.12/GB and, I believe, has a 10MB file size limit, so I'd have to break the data up a bit; a rough calculation suggests this would cost me about $30 to $120 a day. Alternatively, something that looks like plain static content delivery with no logic, such as Usenet.nl, works out to about $0.025/GB, which would cost me about $6 to $25 a day. Am I going about these calculations the right way, and is there a better option for plain static content at a decently high delivery volume? Again, some basic security would be great, but if the cost drops substantially without it then I'm open to that.
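    For what it's worth, a quick back-of-the-envelope check of those figures (a sketch in Python; the per-GB prices are just the two quoted above):

      # rough daily cost for delivering 50 MB, 5k-20k times per day
      PAYLOAD_GB = 50 / 1024.0                      # ~0.049 GB per delivery
      for downloads in (5000, 20000):
          gb_per_day = PAYLOAD_GB * downloads
          for label, price in (("$0.12/GB", 0.12), ("$0.025/GB", 0.025)):
              print("%5d/day -> %6.1f GB/day at %s = $%6.2f/day"
                    % (downloads, gb_per_day, label, gb_per_day * price))

    With these inputs the output lands at roughly $29-$117 a day at $0.12/GB and $6-$24 a day at $0.025/GB, which matches the ranges above.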

    Read the article

  • Apache domain redirect to a subfolder

    - by Dennis
    I have a hosting account with GoDaddy; it's a Linux system running Apache. The way they set things up, your primary domain lives in the root folder, and when you add a subdomain it goes into a subfolder of the root, which is awkward. I want to set up a subfolder structure to organize my domains. I called GoDaddy support and they said to use redirects, but couldn't tell me how. How it is set up now: the primary domain www.domain.com is served from /, and sub.domain.com from /sub. I want to create a directory structure where www.domain.com is served from /domain/www and sub.domain.com from /domain/sub, while only www.domain.com ever shows in the URL. I tried:

      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www.)?domain.com$
      RewriteRule ^(/)?$ domain/www [L]

    but it just changes the URL to www.domain.com/domain/www. Can this be done in .htaccess?
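    For the www.domain.com part, a sketch of the usual shared-hosting pattern (the folder names are taken from the question, everything else is an assumption): rewrite internally, and exclude requests that are already inside the target folder so the rule neither loops nor exposes the path.

      # .htaccess in the account root -- sketch only
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
      RewriteCond %{REQUEST_URI} !^/domain/www/
      # no [R] flag, so this stays an internal rewrite and the URL is unchanged
      RewriteRule ^(.*)$ /domain/www/$1 [L]

    If the address bar still changes, check whether an .htaccess inside /domain/www issues an external redirect of its own.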

    Read the article

  • Google not recognizing microdata? [duplicate]

    - by user1795832
    This question already has an answer here: How long for data highlighter markup to appear in the structured data tool? (2 answers). I added schema.org microdata to one page of a site I help manage. Using the Google Webmaster Tools testing tool, the page checks out and the microdata is displayed correctly. But when I go to the Structured Data page in Webmaster Tools, it keeps saying the site has none. I added the markup two weeks ago. Is it just something that takes a while to be recognized? Or does microdata have to be on every page before it shows up?
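    For reference, a minimal sketch of the kind of schema.org microdata in question (the item type and values are placeholders, not the actual markup from the site):

      <!-- hypothetical schema.org microdata on a single page -->
      <div itemscope itemtype="http://schema.org/Organization">
        <span itemprop="name">Example Company</span>
        <a itemprop="url" href="http://www.example.com/">example.com</a>
      </div>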

    Read the article

  • Should I learn HTML/CSS before PHP, even if I only use it with a database? [on hold]

    - by Sadegh
    I've seen lots of questions on this topic, and all of them ask whether someone who wants to use PHP for building web pages should learn HTML first. Most of the answers say yes, because most of the time you build pages with both PHP and HTML (and maybe CSS). But if I just want to use PHP to talk to my database (for example MySQL) and nothing more, do I need to learn any HTML or CSS first?

    Read the article

  • Making my own clothes website [on hold]

    - by Manjushree
    I am a BSc student in Mathematics, but I would like to create my own clothes website. Can anyone help me figure out how to design it? I have no background in building web pages. The site does not have to look professional, just simple enough that I can show my clothes to people or customers. Once the site is up, I can open a business account and start selling the goods online. Do I need to buy a domain to create the website? Please help.

    Read the article

  • How can I decrease relevancy of Creative Commons footer text? (In Google Webmaster Tools)

    - by anonymous coward
    I know that I may just have to use the image link alone to make this happen, but I figured it was worth asking in case there is some other semantic markup or tip I could use. I have a site that uses the textual Creative Commons blurb in the footer. The markup is like so:

      <div class="footer">
        <!-- snip -->
        <!-- Creative Commons License -->
        <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/"><img alt="Creative Commons License" style="border-width:0" src="http://i.creativecommons.org/l/by-nc-sa/3.0/us/80x15.png" /></a><br />This work by <a xmlns:cc="http://creativecommons.org/ns#" href="http://www.xmemphisx.com/" property="cc:attributionName" rel="cc:attributionURL">xMEMPHISx.com</a> is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/">Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License</a>.
        <!-- /Creative Commons License -->
      </div>

    Within Google Webmaster Tools, the list of relevant keywords is heavily saturated with the text from that blurb: half of my top ten most relevant keywords, interleaved with the site name and the real keywords, are "license", "commons", "creative", "alike" and "attribution". I have not done any extensive testing to find out whether this list even matters, and so far it has not hurt performance in any way. The site is well designed for humans, and it is as findable as it needs to be at the moment. But, mostly out of curiosity: do you have any tips for decreasing the relevancy of the text from the Creative Commons footer blurb?
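    For comparison, the image-only variant mentioned above keeps the license link but drops the keyword-heavy sentence; reusing the markup from the question, it would be roughly:

      <!-- Creative Commons License, image only -->
      <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/">
        <img alt="Creative Commons License" style="border-width:0"
             src="http://i.creativecommons.org/l/by-nc-sa/3.0/us/80x15.png" />
      </a>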

    Read the article

  • Please explain some of the features of the URL Rewrite module for a newbie

    - by kunjaan
    I am learning to use the IIS URL Rewrite module, and some of the "features" listed on its page are confusing me. It would be great if somebody could explain them and give a first-hand account of when you would use each one. Thanks a lot! Rewriting within the content of specific HTML tags; access to server variables and HTTP headers; rewriting of server variables and HTTP request headers (what are these "server variables", and when would you define or redefine them?); rewriting of HTTP response headers; the HtmlEncode function (why would you HTML-encode on the server?); the reverse proxy rule template; support for IIS kernel-mode and user-mode output caching; Failed Request Tracing support.
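    As a concrete illustration of the "rewriting of server variables and HTTP request headers" item (a sketch only: the rule name, URL pattern and target are made up, and custom server variables normally have to be allow-listed in the module's configuration before a rule may set them):

      <!-- web.config sketch: rewrite a URL and set a request header/server variable -->
      <system.webServer>
        <rewrite>
          <rules>
            <rule name="HypotheticalBlogRewrite" stopProcessing="true">
              <match url="^blog/(.*)$" />
              <serverVariables>
                <!-- HTTP_X_ORIGINAL_URL is surfaced to the app as the X-Original-Url header -->
                <set name="HTTP_X_ORIGINAL_URL" value="{REQUEST_URI}" />
              </serverVariables>
              <action type="Rewrite" url="articles.aspx?slug={R:1}" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>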

    Read the article

  • What is the proper way to set up Google Apps email accounts for a subdomain?

    - by binaryorganic
    Let's say I've registered domain.com at enom and set it up to use Google Apps for email by pointing the DNS at enom's servers and editing the MX records there. That works flawlessly. Now I want email at a subdomain of the same site. The subdomain already works at the host, but I want to catch its email traffic at enom before it gets that far. I've set up Google Apps as a new account for the subdomain, successfully verified domain ownership, and now they want me to update the MX records. What's the right format? For domain.com I just put @ for the hostname and then entered the address and priority values Google gave me. I tried putting subdomain.domain.com as the hostname for the new records, but that doesn't seem to work. What am I doing wrong?
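    For reference, the standard Google Apps MX set written zone-file style for a subdomain; the exact host-field format is an assumption about enom's interface, but many editors expect just the relative label ("subdomain") rather than the fully qualified name:

      ; MX records for subdomain.domain.com (sketch)
      subdomain    IN    MX    1     ASPMX.L.GOOGLE.COM.
      subdomain    IN    MX    5     ALT1.ASPMX.L.GOOGLE.COM.
      subdomain    IN    MX    5     ALT2.ASPMX.L.GOOGLE.COM.
      subdomain    IN    MX    10    ASPMX2.GOOGLEMAIL.COM.
      subdomain    IN    MX    10    ASPMX3.GOOGLEMAIL.COM.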

    Read the article

  • How to promote/market an event that needs many people?

    - by stjowa
    My team is about to launch a new web application, http://wethepixels.com, that needs a lot of people on the site at the same time for the concept to work. We are preparing to promote an event for a specific date and time in order to draw a large group of people to the site at once. For those who have gone through a similar launch, we would love to hear ideas on the best way to market to a large group in a relatively short period of time. We have created a Facebook page and a Facebook event, but (to our surprise) they have not attracted much attention yet. Is there a better way to attract a large number of users in a short period of time? Thanks.

    Read the article

  • Webmaster Tools - Network Unreachable

    - by Jayapal Chandran
    Webmaster Tools reports robots.txt as unreachable for my site, and for all links in the sitemap it says "network unreachable"; sitemap.xml is unreachable too. These appear on the crawl stats page. I raised it with my host's support team, and they replied (roughly): they have verified the Apache logs and cannot see any issues with the website or web server; the problem may be a routing issue between Google's servers and theirs; when hits from Google's bots spike, the IP is automatically blacklisted by their firewall to avoid server load and downtime; and since they have no access to Google's services, they cannot provide details or logs from that side. The sitemaps link shows an exclamation mark, which means the file was not reachable. What could the problem be, and how do I solve it?

    Read the article

  • How to effectively use an overseas SEO team?

    - by Dan Gayle
    My company currently has a contract with a 20+ person team in the Philippines, previously used for comment linking and guest-blogging spun content articles. That is a practice we are stopping, but we don't want to let the team go: they work hard, they're very inexpensive, and they produce excellent accounting and reporting of their actions. What are some ways we can best put them to use as a link-building or content-generating resource? Their English is fair, but not good enough to use them for any direct content creation. Thanks.

    Read the article

  • How do you set up the directory structure for a multilingual site without duplicating content?

    - by Ricardo
    I want to make a website in two languages. I've looked around and settled on the directory approach to separating the languages. How do I make it work? Let's say I have the following three files for the landing homepage, the English page and the Spanish page: http://www.domain.com/index.html, http://www.domain.com/en/index.html and http://www.domain.com/es/index.html. Let's also say that /index.html will be in English, with a link to /es/index.html. In turn, /es/index.html will have a link back to the English version; would that link point to /index.html or to /en/index.html? And how do I get both English versions (the one at the root and the one in the /en/ directory) to actually be the same file? I'm new to this, so I'm not using any scripts yet. To me the obvious solution is to duplicate the English page and have the root copy point to the files under the /en/ directory, but I'm not a fan of duplication and I've learned that search engines really frown on it. Can anyone point me in the right direction?
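    One way to avoid keeping two copies of the English page (a sketch, assuming Apache with .htaccess enabled) is to store only /en/index.html and /es/index.html and map the root to the English copy with an internal rewrite:

      # .htaccess at the site root -- sketch only
      RewriteEngine On
      RewriteRule ^$ /en/index.html [L]

    In each page's head, a pair of alternate links such as <link rel="alternate" hreflang="en" href="http://www.domain.com/en/index.html" /> and the matching hreflang="es" line then tells search engines the two directories are translations of each other rather than duplicates.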

    Read the article

  • How to Estimate Needed Bandwidth for a New Web Application?

    - by Noah Goodrich
    I am working on a brand new SaaS web application and need to estimate the initial bandwidth usage. Since the site doesn't exist yet, and since this is my first endeavor of this sort, I'm not really sure how much bandwidth to estimate to begin with. We will be using Linux, Apache, PHP and Mysql. The content will be generated dynamically. There will be images as part of the site design but user's will also upload images that will be displayed and documents that will be stored for download at later times. We'd like to be able to support 500,000 page loads per month with estimated image loads being about two to three times that.
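    A rough way to turn those numbers into a transfer estimate (a sketch in Python; the per-page and per-image sizes are pure assumptions chosen to show the arithmetic, not measurements):

      # back-of-the-envelope monthly transfer estimate
      PAGE_LOADS = 500000
      AVG_PAGE_KB = 60                      # assumed HTML + CSS + JS per page view
      IMAGE_LOADS = 3 * PAGE_LOADS          # "two to three times" -> upper bound
      AVG_IMAGE_KB = 100                    # assumed average image size

      total_gb = (PAGE_LOADS * AVG_PAGE_KB + IMAGE_LOADS * AVG_IMAGE_KB) / 1024.0 / 1024.0
      print("~%.0f GB per month" % total_gb)   # ~172 GB with these guesses

    Swapping in real averages once a prototype exists is what turns this from a guess into an estimate.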

    Read the article

  • What's better for SEO for many international markets?

    - by Roy Rico
    Right now we're working to migrate our company sites for international markets to this scheme: www.company.com/[two-letter country code], e.g. www.company.com/uk for the United Kingdom, www.company.com/au for Australia, www.company.com/jp for Japan, and www.company.com/ for the United States and visitors we can't identify. However, in Google Webmaster Tools we can geo-target each directory but not the root, and if we geo-target the root as US, all the other markets will inherit that. Is it better to move the US market to /us/ or leave it where it is?

    Read the article

  • Two <select> elements always next to each other inside a <td>? [closed]

    - by Radek
    I have two selects inside a td, and I want to make sure they always sit next to each other, with the td exactly as wide as the two selects and no wider. The catch is that the values displayed in the selects change based on the data.

      <td>
        <select name="db2.rfthdd">
          <option value="WEI">WEI</option>
          <option value="SCOTSdatabase">SCOTSdatabase</option>
        </select>
        <select id="db2.rfttimestamp">
          <option value="20110302122831">2011-03-02-122831</option>
          <option value="20110302122442">2011-03-02-122442</option>
        </select>
      </td>
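    A minimal sketch of one way to do that (assuming a class can be added to the cell and no other styles interfere): stop the cell from wrapping between the two selects and let it collapse to their combined width.

      /* for a td with class="select-pair" -- hypothetical class name */
      td.select-pair {
        white-space: nowrap;   /* keep both selects on one line */
        width: 1%;             /* common trick: the cell shrinks to its content width */
      }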

    Read the article

  • How should I deal with user agent parsing in logs?

    - by Mr. Jefferson
    My web app project includes logging functionality so we can see where visitors are coming from (referrer URL), what the popular user agents are, which pages are most popular, and so on. The log is stored in SQL Server, and when I query the user agents I use a large (almost 100 lines) and growing CASE statement to separate them by string matching (i.e. if the user agent contains the string "Firefox/9" then it's Firefox 9). Is there a better way to do this, so I don't have to keep adding to that CASE statement for every new browser release? Also, how should I deal with the less common, weird or unknown user agents? I've seen the following in the logs and been unable to find good information online about what they are: WordPress/3.3.1; http://www.facecolony.org and Mozilla/4.0 ( http://www.hairirons.org redips; <a href=http://hairirons.org/>chi hair iron</a>). I'd guess they're bots or crawlers, but the sites they point to don't appear to reference web crawlers (or are sometimes not even available). Other user agents I don't recognize are clearly bots, because they include "bot" or "spider" or something similar.
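    One alternative to a growing SQL CASE is to classify the raw strings outside the database with an ordered pattern table, which is easier to extend and also catches the "bot"/"spider" cases. A rough sketch in Python (the patterns are illustrative, not exhaustive):

      import re

      # ordered: the first match wins, so put more specific patterns first
      UA_PATTERNS = [
          (re.compile(r"(bot|spider|crawl)", re.I), "Bot/crawler"),
          (re.compile(r"Firefox/(\d+)"),            "Firefox {0}"),
          (re.compile(r"Chrome/(\d+)"),             "Chrome {0}"),
          (re.compile(r"MSIE (\d+)"),               "Internet Explorer {0}"),
      ]

      def classify(user_agent):
          for pattern, label in UA_PATTERNS:
              match = pattern.search(user_agent)
              if match:
                  return label.format(*match.groups())
          # unmatched strings (like the WordPress example above) end up here
          return "Unknown"

      print(classify("Mozilla/5.0 (Windows NT 6.1; rv:9.0.1) Gecko/20100101 Firefox/9.0.1"))  # Firefox 9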

    Read the article

  • Multiple domains with one host - SEO point of view

    - by Swing Magic
    Let's say I currently have the domain mycompanyname.com, and after some time I've found it very hard to reach the top of the SERPs. While searching around, I found a domain that matches one of my keywords. My site is built in two languages, and I could assign each domain to a different language. My question: what is the SEO impact? There would be a lot of duplicated content between the domains (images, video, etc.), and I'm afraid one of the sites would be marked as plagiarised because much of the content is the same. Has anyone dealt with the same situation? Thank you!
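    If the two domains really are the same pages in two languages, the usual signal is a pair of hreflang alternates in the head of each matching page (a sketch; the second domain and the language codes are placeholders), so the shared images and videos read as translations rather than copies:

      <!-- on the corresponding page of each domain -->
      <link rel="alternate" hreflang="en" href="http://www.mycompanyname.com/page.html" />
      <link rel="alternate" hreflang="es" href="http://www.keyword-domain.example/pagina.html" />

    Where a page is genuinely identical in the same language on both domains, a rel="canonical" link to the preferred domain is the more appropriate tag.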

    Read the article

  • Google Authorship issues

    - by user29107
    I am facing the same issue: I have tried many times and followed the whole process, and I still get the same result. "Email verification has not established authorship for this webpage." Email address on the sanjeebpanda.com domain has been verified on this profile: Yes. Public contributor-to link from Google+ profile to sanjeebpanda.com: Yes. Automatically detected author name on webpage: Not Found. What should I do? Please help.
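    For reference, the markup-based alternative to email verification at the time was a visible byline whose name matched the Google+ profile name exactly, linked with rel="author" (a sketch; the profile ID and name below are placeholders):

      <!-- byline on each article page -->
      <p>Written by
        <a rel="author" href="https://plus.google.com/00000000000000000000">Author Name</a>
      </p>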

    Read the article

  • Webshop for digital goods with voucher / gift card system [duplicate]

    - by Kelzama
    This question already has an answer here: Which Ecommerce Script Should I Use? (1 answer). I'm looking for a webshop that provides the following: the shop offers digital goods (like MP3s); a user can buy a voucher / gift card at a reseller (or a code is provided with the CD); an unregistered user can enter the code on the webshop and get the download; a registered user can enter the code and have the download added to his or her library; and, optionally, resellers can buy codes from the webshop. I already tried PrestaShop, which looks quite nice but needs a lot of custom programming and has a very strange voucher system: the customer has to add the file to the basket and apply the voucher at checkout, which I want to skip. Is there a webshop (or CMS plus plugin) that does what I need? It could also be a CMS with a storage/folder plugin (like Joomla + K2) and a way to unlock downloads via unique codes. Any ideas are highly appreciated. Thanks in advance.

    Read the article

  • How to correctly track analytics when using an iframe

    - by Sherry Ann Hernandez
    In our main aspx page we have this analytics code:

      <script type="text/javascript">
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-1301114-2']);
        _gaq.push(['_setDomainName', 'florahospitality.com']);
        _gaq.push(['_setAllowLinker', true]);
        _gaq.push(['_trackPageview']);
        _gaq.push(function() {
          var pageTracker = _gat._getTrackerByName();
          var iframe = document.getElementById('reservationFrame');
          iframe.src = pageTracker._getLinkerUrl('https://reservations.synxis.com/xbe/rez.aspx?Hotel=15159&template=flex&shell=flex&Chain=5375&locale=en&arrive=11/12/2012&depart=11/13/2012&adult=2&child=0&rooms=1&start=availresults&iata=&promo=&group=');
        });
        (function() {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga, s);
        })();
      </script>

    Then inside this aspx page is an iframe, and inside the iframe we set up this analytics code:

      <script type="text/javascript">
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-1301114-2']);
        _gaq.push(['_setDomainName', 'reservations.synxis.com']);
        _gaq.push(['_setAllowLinker', true]);
        _gaq.push(['_trackPageview', 'AvailabilityResults']);
        (function() {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga, s);
        })();
      </script>

    The problem is that I see two pageviews for the AvailabilityResults page: one attributed to direct traffic and the other to cpc. How can they have different sources? I was expecting both of them to show up as direct traffic.

    Read the article
