Search Results


  • Google search does not show sub-pages from my website

    - by Chang
    My website appears in Google search, but only the first page. Of course I have sub-pages linked from the first page, but the sub-pages do not show up in Google search. Not in Yahoo, not in Bing. What should I do? It has been three years and the sub-pages still do not show. (I tried searching site:mydomain.com and clicked the 'repeat the search with the omitted results included' link.) What would you suspect is the reason? My website addresses were like xxx.php?yy=zzz, so I changed them to /yy/zzz using mod_rewrite, roughly as in the sketch below. I also thought it might be (X)HTML standard violations, so I fixed those too. I hope Google will soon have my entire website, but I am a little bit pessimistic. Do you have any thoughts?
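    For reference, a minimal sketch of the rewrite involved (xxx.php and the yy parameter are the placeholder names from above; the rules themselves are illustrative, not taken from my actual config):

        # .htaccess sketch, assuming Apache with mod_rewrite enabled
        RewriteEngine On

        # 301-redirect the old query-string form so search engines consolidate
        # on the new URLs; matching THE_REQUEST (the original request line)
        # keeps the internal rewrite below from looping back into this rule
        RewriteCond %{THE_REQUEST} \?yy=([^&\s]+)
        RewriteRule ^xxx\.php$ /yy/%1? [R=301,L]

        # Map the clean URL back to the real script internally
        RewriteRule ^yy/([^/]+)/?$ xxx.php?yy=$1 [L]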


  • Text limit on analytics event code

    - by Theo G
    I am just about to add event-tracking code to a button that downloads a PDF. The event call signature is: _trackEvent(category, action, opt_label, opt_value, opt_noninteraction). Example of event code: onClick="_gaq.push(['_trackEvent', 'Videos', 'Play', 'Baby\'s First Birthday']);" I was just wondering if anyone knows whether there is a text limit on the label (in ga.js the free-text slot is opt_label; opt_value only takes an integer). Do you think the following would be too long: 'Elmhurst School says IPC has made all the difference'?
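    For concreteness, the download button would look roughly like this (the category, action, and file name here are made up for illustration):

        <!-- hypothetical PDF download link with classic (ga.js) event tracking -->
        <a href="/brochure.pdf"
           onclick="_gaq.push(['_trackEvent', 'Downloads', 'PDF',
                               'Elmhurst School says IPC has made all the difference']);">
            Download the brochure
        </a>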


  • How does Wikipedia's SEO work?

    - by Josh Siegl
    I'm sorry if this question is misplaced or doesn't belong here. I'm currently developing an app for Android and iOS, and of course I'm thinking about the best ways to market it. Last night I Googled somebody else's app, and the third link in was a Wikipedia page on it. I had never even thought of apps having Wikipedia pages, but there it was. And of course it was very helpful in determining exactly what the app did and in what cases it was useful (something that's absolutely crucial for potential customers to understand). So then I got to thinking that I should create a Wikipedia page for my app, but how does Wikipedia apply SEO? I know the question could be overly complicated or specific; I'm just looking for general answers. For instance, when somebody Googles my app, where does Wikipedia display in the results? When I create a page for my app, how do I ensure that it shows in the search results (is there any way to do that)? I'm sure I'll find all of this out later when I create the page; I guess I'm just asking this out of curiosity. So how does Wikipedia's search engine optimization work, on a page-by-page basis?


  • Google Cache showing wrong URL

    - by Sathiya Kumar
    I searched for the cache details of the URL http://property.sulekha.com/pune-properties, but Google Cache shows details for property.sulekha.com. I don't know why it's showing like this. It's not only http://property.sulekha.com/pune-properties but also all the Indian-city URLs like http://property.sulekha.com/chennai-properties , http://property.sulekha.com/mumbai-properties , http://property.sulekha.com/kolkata-properties etc. I don't even find these URLs in the Google search results. If I search for Chennai properties in Google, I find property.sulekha.com and not http://property.sulekha.com/chennai-properties . Why is it happening like this? Please let me know.


  • Do web crawlers/spiders index Azure web sites?

    - by Clay Shannon
    For somebody who wants their web site to be as discoverable as possible (and who doesn't?), are Microsoft's Azure web sites (azurewebsites.net) a feasible domain to host sites on? I have a site that is both on azurewebsites.net and hosted under a completely different name by discountasp.net. Both of these sites are exactly the same except for the URL; whenever I update the code, I republish the site to both places. So obviously, they both have the same H1 and H2 elements. Searching for the content of my H1 tag, I find my .com site listed #3 on Google and #2 on both Bing and Yahoo; on the other hand, my azurewebsites.net site doesn't show up on the first page at all in any of them. This makes me wonder if azurewebsites.net should only be used for Web API hosting and such-like, not for generic/commercial "public" sites. Are my conclusions valid?
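    One possibility the post itself does not name: the two copies are exact duplicates, and search engines generally index only one copy of duplicated content. A hedged sketch of the usual remedy, a canonical link served in the <head> of both copies pointing at the preferred URL (example.com is a placeholder):

        <!-- placeholder domain; tells crawlers which duplicate to index -->
        <link rel="canonical" href="http://www.example.com/current-page/" />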


  • Which is the best CMS for building a video tutorial website? [duplicate]

    - by Rajesh Sankar R
    This question already has an answer here: How to choose a CMS system for a small web site? (4 answers)

    I am planning on setting up a website where I will be posting how-to and training videos for various software packages. There will be several training videos under a single topic, and several topics under a single software title. Similar to lynda.com, but on a smaller scale. Can you suggest a CMS for the job? Initially I am planning on hosting the videos on YouTube and linking them into my website, but I will be moving to my own hosting later on. It must be customizable, scalable, and most of all OPEN SOURCE.


  • Ditch cPanel / WHM in favour of manual setup

    - by BWRic
    We currently use cPanel / WHM on a reseller account but are looking at getting a dedicated server. My first thought was to duplicate this setup on the dedicated box to allow us to quickly create new accounts. It'll be a managed server, so they'll have set up the LAMP stack. I'm curious whether I actually need cPanel and WHM. We don't use many of the features of cPanel / WHM, just creating accounts and databases; clients do not have FTP access. I'm no sysadmin and come from a Windows / GUI background, but I have some knowledge of setting up development servers.

    WHM: creating accounts. I presume this sets up the Apache virtual host, FTP access and DNS settings. I have some knowledge of editing the Apache files to create virtual hosts. Am I correct in thinking that as long as the DNS points to the server IP and the virtual host is configured, the server can serve the (PHP) pages? I'm not sure I need per-site FTP access, as only we will have access, so I could have server-wide access to view all the sites. The company supplying the dedicated host also provides its own DNS management tool, so I don't need the cPanel one.

    MySQL: creating users and databases. We use cPanel to create the MySQL users and databases. As it's a dedicated box and I can have root access, I think this could be replaced by SQLyog for database management and phpMyAdmin for user management.

    Do I need cPanel, or can I get by editing a few text files to create the accounts (roughly as sketched below) and then use the MySQL tools for databases? Or am I missing something major about how the sites are configured?
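    For reference, a minimal sketch of what each "account" boils down to without cPanel; every name and path below is a placeholder, not from an actual setup:

        # /etc/apache2/sites-available/example.com.conf -- one virtual host per client site
        <VirtualHost *:80>
            ServerName example.com
            ServerAlias www.example.com
            DocumentRoot /var/www/example.com/htdocs
            ErrorLog ${APACHE_LOG_DIR}/example.com-error.log
            CustomLog ${APACHE_LOG_DIR}/example.com-access.log combined
        </VirtualHost>

    and the matching database and user, roughly what cPanel runs behind the scenes:

        -- placeholder names; run as the MySQL root user
        CREATE DATABASE clientdb;
        CREATE USER 'clientuser'@'localhost' IDENTIFIED BY 'a-strong-password';
        GRANT ALL PRIVILEGES ON clientdb.* TO 'clientuser'@'localhost';
        FLUSH PRIVILEGES;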


  • Wiki Application With A Reputation System

    - by Christofian
    I'm really impressed with Stack Exchange's concept of reputation (you gain reputation as you post, and the more you post, the more privileges you get), and I want to apply the concept to a wiki that I am building. Does anyone know of a PHP wiki that has a concept of privileges/reputation similar to Stack Exchange? I'm not necessarily looking for something identical to SE; I'm just looking for a wiki application that gives users more privileges the more they contribute positively to the wiki (SE has down votes, so the wiki should have some way of identifying negative contributions too). The privileges should be category-based, so the more active you are in a specific category or page, the more privileges you get for that category. There should be site-wide privileges as well, though those should be harder to earn than the category privileges. NOTE: If it is not possible to get both category-wide and site-wide privileges, I will be OK with just one or the other. I should be able to change the requirements for each privilege, through an administration panel or by editing a file (some wiki applications don't have administration interfaces). Does anyone have a script or a solution that will do this? If the script uses something similar to reputation to determine how much a user has positively contributed to the site, that is OK too. Please note: I am looking for a way to rate individual user contributions, not a way to rate the quality of an entire page.


  • Is a merchant account a requirement for a website to take payments?

    - by calum
    Hi, I have had a quick look but couldn't see anything related. Basically, if we were to accept payments for events on our website via PayPal (essentially a Buy Now button), as a business, do we need a merchant account, or will a regular bank account be acceptable? I may have some confusion in terms: my understanding is that you need a merchant account to accept credit card payments directly, but as we are using PayPal, is this necessary? Thank you for any clarification. Disclaimer: I've read "What are some options for taking payments on my website?" but it doesn't explicitly say whether we require a merchant account or not. Thank you.


  • Server-infrastructure recommendations

    - by Tim van Elsloo
    Here's the thing: I need a cheap, fast, reliable infrastructure that can dynamically scale (like Amazon S3 cloud storage). I'm thinking of 3 different types of 'servers':

    Application server:
    - Should be able to run CentOS (or another lightweight Linux distro)
    - Should be able to run Apache
    - Should be able to run PHP
    - Should be able to run GD (so it does rely on its CPU)
    - Should be extremely reliable and fast

    Database server:
    - Should be able to run MySQL
    - Should be able to... well, do nothing else :P
    - Should be extremely reliable and fast

    Storage server:
    - Should be able to run some kind of file-transfer daemon (FTP, CouchDB, etc.)
    - Should be able to do nothing else
    - Should be extremely reliable and fast

    So technically, by transferring all static data to 2 different servers/services, the application server can totally focus on the web pages. My questions:
    - What services do you recommend?
    - Which is cheaper, faster and more reliable: using my own server, or using a cloud-storage/cloud-computing service (like Amazon S3, CloudFiles, etc.)?
    - How can I prevent bandwidth abuse (such as DoS attacks causing the bill to be extremely high)?
    - What's the difference between "including CDN" and "excluding CDN"? The price doesn't seem to differ at CloudFiles. When you enable the delivery network, do you pay both rates, or only the "including CDN" rate?
    - Should I run my own nameserver, or can I use my domain hoster's nameservers? What are the minimum software requirements for a nameserver? Could I write the software myself? Does anyone have a good protocol description?

    I hope you can answer my questions. Answer so far: I shouldn't write my own nameserver software; instead, I should use something like BIND (http://osspro.com/2010/05/04/linux-create-your-own-domain-name-server-dns/).


  • Disqus integration in website: what is wrong?

    - by Thieme Hennis
    Hi, I'm trying to embed a Disqus forum in a website I created. I used the exact code and instructions from their installation guide, but I just can't get it working, and there's not much on Google either. Is something wrong in the code? Should I change anything?

        <!DOCTYPE HTML>
        <html>
        <head>
            <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
            <link rel="shortcut icon" href="../favicon.ico">
            <title>Little Louie | Hennis &amp; Blaisse Lovers Productions</title>
            <meta name="keywords" content="some,tags">
            <link href="../style2.css" rel="stylesheet" type="text/css">
        </head>
        <body>
            <p><b>Some text</b></p>

            <!-- Disqus embed -->
            <div id="disqus_thread"></div>
            <script type="text/javascript">
                (function() {
                    var dsq = document.createElement('script');
                    dsq.type = 'text/javascript';
                    dsq.async = true;
                    dsq.src = 'http://littelouie.disqus.com/embed.js';
                    (document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(dsq);
                })();
            </script>
            <noscript>Please enable JavaScript to view the <a href="http://disqus.com/?ref_noscript=littelouie">comments powered by Disqus.</a></noscript>
            <a href="http://disqus.com" class="dsq-brlink">blog comments powered by <span class="logo-disqus">Disqus</span></a>

            <!-- Disqus comment-count script -->
            <script type="text/javascript">
                var disqus_shortname = 'littelouie';
                (function () {
                    var s = document.createElement('script');
                    s.async = true;
                    s.src = 'http://disqus.com/forums/littelouie/count.js';
                    (document.getElementsByTagName('HEAD')[0] || document.getElementsByTagName('BODY')[0]).appendChild(s);
                }());
            </script>
        </body>
        </html>


  • Synchronise database between servers via PHP [closed]

    - by Emmanuel
    Hi guys, I need to synchronise two MySQL databases between different servers on a regular basis, via a client-initiated interface. I've been doing it over a remote MySQL connection, adding the IP of the client's server to the whitelist for remote MySQL connections. The problem, however, is that the client has a dynamic IP, so as soon as it changes they can no longer sync. So I'm trying to find an alternative way of synchronising the two databases via some sort of secure PHP script, along the lines of the sketch below.
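    A minimal sketch of the kind of endpoint I have in mind (the shared secret, database credentials, and table/column names are all placeholders): the client pulls changed rows over HTTPS with a shared-secret HMAC instead of a direct remote-MySQL connection, so its IP no longer matters.

        <?php
        // sync-endpoint.php -- hypothetical pull endpoint, served over HTTPS
        $secret = 'replace-with-a-long-random-string';

        // Reject requests that do not carry a valid HMAC of the query string
        $sig = isset($_SERVER['HTTP_X_SIGNATURE']) ? $_SERVER['HTTP_X_SIGNATURE'] : '';
        if (!hash_equals(hash_hmac('sha256', $_SERVER['QUERY_STRING'], $secret), $sig)) {
            http_response_code(403);
            exit('Forbidden');
        }

        // Return every row changed since the client's last sync point
        $since = isset($_GET['since']) ? $_GET['since'] : '1970-01-01 00:00:00';
        $pdo = new PDO('mysql:host=localhost;dbname=appdb;charset=utf8', 'dbuser', 'dbpass');
        $stmt = $pdo->prepare('SELECT * FROM records WHERE updated_at > ? ORDER BY updated_at');
        $stmt->execute(array($since));

        header('Content-Type: application/json');
        echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));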


  • Which e-commerce platform works well with Flash product customization plus social features?

    - by Artur
    What's the best platform out there that is flexible enough to easily integrate this?

    Custom Flash app. I would like customers to:
    1. Select a t-shirt from a gallery of artists.
    2. Customize it (using a Flash tool I created).
    3. Select a t-shirt size.
    4. Order it.

    All this Flash widget does is generate a JPG on the server; the e-commerce app should assign it to that order/customer and add it to their shopping cart.

    Social features. Customers should also be able to comment on the t-shirts and artist bios.

    I was thinking of trying WordPress plugins like Shopp, GetShopped or Cart66, then BuddyPress for the social features. Or is Magento a better choice? Thanks!


  • Purchase existing domain and transfer to new registrar

    - by Kiefer
    I am purchasing an existing domain from the owner, who has it registered with GoDaddy. I want to transfer the domain to another registrar and, of course, have it under my name. If they update the registrant info to my name first, the domain will be locked for 60 days, and that's no good. But if they simply transfer it to my registrar, how will the registrant info get updated? I know about escrow services, but I don't feel I need one, because I trust the seller and the amount is (relatively) small. Advice? Thanks!


  • SH404SEF URLs in Joomla 1.5

    - by Tao Bellamine
    I have two places to play with URLs: the global configuration and the sh404SEF extension. The global config is set to "SEF URLs: Yes" and "mod_rewrite enabled: Yes", and sh404SEF is set to "URL optimization: No". My problem is that even with SEF URLs set in the global config, my URLs still don't seem that user-friendly, so I turn on URL optimization in sh404SEF and get better, more descriptive URLs. However, the problem I inherit from doing this is that my dynamically populated ChronoForms get messed up (only the ChronoForms; other forms are fine). These forms now show up at the homepage instead of on their own reserved pages. Here's an example. Old "good" URL: http://www.mycraftwork.com/index.php?option=com_content&view=article&id=94 New optimized "bad" URL: http://www.mycraftwork.com/handthrown-pottery/alladin-teapot/index.php?option=com_content&view=article&id=94 Any help would be GREATLY appreciated! I can even turn sh404SEF on and off if people are interested in seeing the issue live. Thanks!! Tao Bellamine


  • How much does a website like bytes.com earn?

    - by robin das
    I am running a website similar to bytes.com (an IT Q&A site); my site attracts 600 unique visitors daily, and we are modelling ourselves on bytes.com. Can anyone tell me whether we can earn some serious money with this kind of website? WebsiteOutlook estimates bytes.com's daily pageviews at 700,636. Can anyone please let me know what kind of earnings we can expect for a dotcom like bytes.com? Please shed some light on this topic, as a lot of energy, time and money goes into building this kind of website. Thanks, Robin Das


  • Google's process for publishing/modifying pages [closed]

    - by Glenn Dayton
    I'm assuming that a group of people at Google have control of certain sections of google.com, but how does Google make sure that employees don't accidentally or intentionally sabotage the website? Does Google use Adobe Contribute or some similar product for sharing/publishing the website? Do employees use WebDAV, FTP, SFTP, or SSH to publish the site? Since Google has hundreds of thousands of servers, it probably takes some time for its servers to update. Do they transmit the new copy of the website to all servers before publishing it at once? This question does not apply to Google editing a database and having a page reflect the database's changes; it applies to employees editing the source code and/or back end of the site.


  • Having a good domain name and using domain aliases (I use notlong.com)?

    - by Michal P.
    I use only free servers. After creating my website, http://pundaquit.republika.pl, I decided to make it reachable under a simpler domain name, so I used the alias service http://notlong.com/ and got the simple address http://pundaquit.notlong.com. A second advantage of using an alias here was to be independent of my file host, which I will have to change at some point. I haven't found a better alias service than notlong, because notlong.com addresses are easy to remember. After that I encountered many problems:
    - most forums and social services treat notlong addresses as spam,
    - Bing so far hasn't accepted the http://pundaquit.notlong.com domain, among others.

    Is there another way to get a good free domain name? And what happens when your hosting provider tells you your account is expiring? Only a lasting layer of domain aliases makes you independent of the actual file hosts.


  • Looking for a CDN

    - by Bill
    Most of the CDNs that I've seen require you to upload your content in advance. I'm looking for a CDN that, upon receiving a request for a resource it hasn't seen, will contact my application server. If the application server returns something, the CDN should send it to the user and then cache it. If not, it should just return a 404. If the user requests an unexpired item, the CDN should serve it without bothering my app server. Does anything like this exist? Is there a way to get CloudFront to work like this?


  • Why my domain redirect on Google Apps is returning 404?

    - by Tom Brito
    I have a configuration in the Google Apps control panel (dcc.securepaynet.net) to redirect tombrito.com to http://buscatextual.cnpq.br/buscatextual/visualizacv.do?id=K4499244H9. It worked fine until a few days ago, but now it's returning 404. If you access tombrito.com you can see the favicon in the browser tab title, but the page shows a 404 error. The target page http://buscatextual.cnpq.br/buscatextual/visualizacv.do?id=K4499244H9 itself is fine; it's some problem with my redirect. Any idea what's wrong here?


  • How to check that I have recovered from Penguin 2.0?

    - by Simon Walker
    I have a 3-year-old website which was hit by Penguin 2.0 in May; traffic dropped almost 30%. I have been working hard on the website for the last 2.5 months, and its traffic recovered in the last week of August. In fact, I am receiving more traffic than ever. When I look at the stats, I find my website's search engine visibility has improved: it is now appearing for more search queries, and my website's impressions have also increased. What I am worried about is that my website is nowhere in the top 5 pages for keywords with high competition and the highest search volume. They are few in number but important. Should I consider my current situation a recovery, or just a partial recovery? And if it is only partial, how come traffic is higher than it was before Penguin 2.0?


  • "X-Robots-Tag: noindex" on an HTTP 301 response

    - by Peter O.
    I understand that a resource served with X-Robots-Tag: noindex forces some search engines, including Google, not to index the resource further. I also understand that an HTTP 301 response causes search engines to use the redirected URL instead of the original URL to refer to the resource. But what happens if both "X-Robots-Tag: noindex" and status code 301 occur on the same response, as in the sketch below? It's likely that the original URL will no longer be indexed, but will that cause the redirected URL to no longer be indexed too? This possibility is not mentioned in the X-Robots-Tag specification.
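    For concreteness, the response in question would look something like this (placeholder URL):

        HTTP/1.1 301 Moved Permanently
        Location: https://example.com/new-url
        X-Robots-Tag: noindex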


  • rel="nofollow" SEO impact

    - by Torez
    I saw a technique used where there was a block with three parts:
    1. Image (wrapped in an anchor tag)
    2. Heading (anchor tag with heading text)
    3. Paragraph (regular p tag with synopsis content)

    For example:

        <li class="block">
            <a rel="nofollow" class="thumb" href="#"><img src="images/placeholder_service_thumbnail.jpg" alt="" /></a>
            <a class="h3" href="#">Good SEO Heading</a>
            <p>Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas. Vestibulum tortor quam, feugiat vitae, ultricies eget, tempor sit amet, ante. Donec eu...</p>
        </li>

    The anchor wrapping the image carries rel="nofollow". The idea is that the user still has the ability to click the image and go to the details page, but the image link does not rank; when users click on the heading text, that is the only link that ranks for that specific page. Q: Is this the correct approach? Does this even do anything? What is the best practice?


  • QapTcha error issue. Works locally, not on live server [migrated]

    - by BlassFemur
    I am adding QapTcha (http://demos.myjqueryplugins.com/qaptcha/) to a website that I am working on, and I'm getting the error "Uncaught TypeError: Cannot read property 'error' of null". What's weird to me is that everything works perfectly locally, with no errors or anything. Once I uploaded via FTP to the live server, I get the above error. Below is the block of code that seems to generate it (the line where the error is thrown is marked with a comment):

        Slider.draggable({
            revert: function(){
                if(opts.autoRevert) {
                    if(parseInt(Slider.css("left")) > (bgSlider.width()-Slider.width()-10)) return false;
                    else return true;
                }
            },
            containment: bgSlider,
            axis: 'x',
            stop: function(event, ui){
                if(ui.position.left > (bgSlider.width()-Slider.width()-10))
                {
                    // set the SESSION iQaptcha in the PHP file
                    $.post(opts.PHPfile, {
                        action: 'qaptcha',
                        qaptcha_key: inputQapTcha.attr('name')
                    },
                    function(data) {
                        if(!data.error)   // <-- Uncaught TypeError: Cannot read property 'error' of null
                        {
                            Slider.draggable('disable').css('cursor','default');
                            inputQapTcha.val('');
                            TxtStatus.text(opts.txtUnlock).addClass('dropSuccess').removeClass('dropError');
                            form.find('input[type=\'submit\']').removeAttr('disabled');
                            if(opts.autoSubmit) form.find('input[type=\'submit\']').trigger('click');
                        }
                    },'json');
                }
            }
        });

    I'm not really sure what's going on or why it works locally and not on the server. Any help/suggestions would be appreciated. Thanks


  • Is there an easier way to implement 301 redirects when converting a site to WordPress

    - by Amanda
    I have just converted a website to WordPress. The old site has hundreds of hard-coded HTML files, and the new site does not match the old site's directory structure or file-naming system (bad SEO in the original site), so I can't place any "blanket" 301 redirects. It's been at least 2 months, and the old links are still appearing in Google searches, despite a Google-friendly sitemap.xml. Do I need to hard-code a 301 for every individual page in my htaccess file (as in the sketch below), or am I just misunderstanding 301s and Apache? Is there some other way I can tell Google that my entire site structure has changed?
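    A sketch of what the per-page approach looks like in .htaccess using mod_alias; every path here is a made-up placeholder. Where a group of old URLs does share a pattern, a single RedirectMatch can cover the whole directory:

        # one line per page whose new URL shares no pattern with the old one
        Redirect 301 /about-us.html /about/
        Redirect 301 /services/widgets.html /services/

        # one rule for a directory whose mapping does follow a pattern
        RedirectMatch 301 ^/news/(.*)\.html$ /blog/$1/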

