Search Results

Search found 9727 results on 390 pages for 'llblgen pro'.


  • Why is Google not crawling my website?

    - by Aman Virk
    I run a design and development blog, http://www.thetutlage.com/. Over the last couple of days my search traffic has dropped from about 70% of my visits to 10%. I am against black-hat SEO myself; all I do is write my own unique content almost every day. Last week my search traffic was really good, but now it is dropping sharply. I have checked my Webmaster Tools dashboard and there is no message there from Google. When I checked my server logs I found that Google last crawled my website on 27 September 2012. I really have no idea what I am doing wrong; I follow all of Google's guidelines to the letter. Please help me.

    Read the article

  • Does Bing support anything like Google's First Click Free program?

    - by Dan Fabulich
    Google has a program for webmasters called First Click Free. Quoting its description:

        To implement First Click Free, you need to allow all users who find a document on your site via Google search to see the full text of that document, even if they have not registered or subscribed to see that content. The user's first click to your content area is free. However, once that user clicks a link on the original page, you can require them to sign in or register to read further.

        The user must be able to see the full content of a multi-page article. You can allow this by displaying all content on a single page to both Googlebot and users. Alternatively, you can use cookies to make sure that a user can visit each page of a multi-page article before being asked for registration or payment.

    Does Bing support anything like this?

    Read the article

  • Internet Explorer and margins

    - by Hailwood
    Hi there. I have some pretty simple HTML which is meant to produce the layout below. To push the tabs down from the user bar I am using margin-top: 35px;. However, in Internet Explorer the tabs are completely misaligned (the top of the tabs is where the bottom should be), so for Internet Explorer I need margin-top: -50px; instead. Why is this, and how can I fix it without using an IE-specific stylesheet?

        <div id="pageHead">
            <div id="userBar">
                <span class="bold">Hi Matthew Hailwood | <a href="#">Logout</a></span>
            </div>
            <a href="http://localhost/buzz/" id="pageLogo"></a>
            <div id="pageTabs" class="clearfix">
                <ul>
                    <li><a href="http://localhost/buzzil/templates">Templates</a></li>
                    <li><a href="http://localhost/buzzil/messaging">Messaging</a></li>
                    <li><a href="http://localhost/buzzil/contacts">Contacts</a></li>
                </ul>
            </div>
        </div>

    with the CSS being:

        #pageHead { height: 100px; }

        #pageLogo {
            float: left;
            width: 149px;
            height: 77px;
            margin-top: 11px;
            background: transparent url('../images/logo.png') no-repeat;
        }

        #userBar {
            text-align: right;
            color: #fff;
            margin-top: 10px;
        }

        #userBar a:link, #userBar a:visited, #userBar a:active {
            font-weight: normal;
            color: #E0B343;
            text-decoration: none;
        }

        .clearfix:after {
            content: ".";
            display: block;
            clear: both;
            visibility: hidden;
            line-height: 0;
            height: 0;
        }

        .clearfix { display: inline-block; }

        html[xmlns] .clearfix { display: block; }

        * html .clearfix { height: 1%; }

        #pageTabs {
            float: right;
            margin-top: 35px;
        }

        #pageTabs ul {
            position: relative;
            width: 100%;
            list-style: none;
            margin: 0;
            padding: 0;
            border-left: 1px solid #000;
        }

        #pageTabs ul li {
            float: right;
            background: url(../images/tabsBg.png) no-repeat 0% 0%;
            border-left: 1px solid #000;
            margin-left: -1px;
        }

        #pageTabs ul li a:link, #pageTabs ul li a:visited, #pageTabs ul li a:active {
            color: #fff;
            background: url(../images/tabsBg.png) no-repeat 100% 0%;
            display: block;
            font-size: 14px;
            font-weight: bold;
            line-height: 42px;
            text-transform: uppercase;
            padding: 4px 32px;
            text-decoration: none;
        }

        #pageTabs ul li a:hover, #pageTabs ul li a:focus {
            text-decoration: underline;
        }

    Read the article

  • How do you set up the directory structure for a multilingual site without duplicating content?

    - by Ricardo
    I want to make a website in two languages. I've looked around and settled on the directory option for separating the two languages. How do I make it work? Let's say I have the following three files for the landing homepage, the English page and the Spanish page:

        http://www.domain.com/index.html
        http://www.domain.com/en/index.html
        http://www.domain.com/es/index.html

    Let's also say that /index.html will be in English, with a link to /es/index.html. In turn, /es/index.html will have a link to the English version. Should that link point back to /index.html or to /en/index.html? And how do I get both English versions (the one at the root and the one in the directory) to actually be the same file in the same directory? I'm new to this, so I'm not using any scripts yet. To me, the obvious solution is to duplicate both English versions and have the one at the root point to files under the /en/ directory, but I'm not a fan of duplication and I've learned that search engines really frown upon it. Can anyone point me in the right direction?
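
    For illustration, one way to make /en/ and the root serve the same physical file is an internal rewrite rather than duplication. A minimal .htaccess sketch, assuming Apache with mod_rewrite enabled (this sketch is an addition, not part of the original question):

        RewriteEngine On
        # serve /en/<anything> from the copy at the root, so only one English file exists
        RewriteRule ^en/(.*)$ /$1 [L]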

    Read the article

  • Why do people crawl sites without downloading pictures?

    - by Michael
    Let me show you what I mean:

        IP               Pages   Hits   Bandwidth
        85.xx.xx.xxx     236     236    735.00 KB
        195.xx.xxx.xx    164     164    533.74 KB
        95.xxx.xxx.xxx   90      90     293.47 KB

    It's very clear that these people are crawling my site with bots: there's no way you could visit my site and use less than 1 MB of bandwidth. You might say there's a possibility they are browsing the site using some browser or plug-in that does not download images, JS/CSS files, etc., but the simple fact of the matter is that there are not 90-236 pages linked from the home page (outside of WP files), even if you visited every page twice. I could understand if these people were crawling the site for pictures, but once again, the bandwidth indicates that this isn't what is happening. Why, then, would they crawl the site simply to view the HTML/txt/JS/etc. files? The only thing I can come up with is that they are scanning for outdated versions of WordPress, SQL injection vulnerabilities, etc., which makes me inclined to outright ban the IPs. But I'm curious: is it possible that such a person is a legitimate user, or at the very least, not intending to be harmful?

    Read the article

  • How to handle non-existent subdirectories?

    - by Question Overflow
    I have a dynamic website with friendly URLs. For example:

        Instead of /user.php?id=123, I have /user/123
        Instead of /index.php?category=fishes, I have /fishes

    But how do I handle non-existent subdirectories such as /about/123? Currently this gives a 200 success instead of a 404 Not Found error. Is there a way to deal with non-existent subdirectories in the Apache config and at the same time allow for friendly URLs? Or do I have to handle this individually in each PHP script?
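
    For illustration, a minimal front-controller sketch in PHP, assuming every rewritten request already passes through index.php and the valid first-level routes are known (the route names here are illustrative):

        <?php
        // whitelist of first-level routes the site actually serves
        $routes = array('user', 'fishes');
        $path = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
        $segments = explode('/', $path);
        if ($path !== '' && !in_array($segments[0], $routes, true)) {
            // unknown subdirectory: answer with a real 404 instead of a 200
            header('HTTP/1.1 404 Not Found');
            exit('404 Not Found');
        }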

    Read the article

  • Trade-off: lower the number of URLs in the sitemap from 43k to 23k, or update sitemap.xml only on a weekly basis?

    - by Tobias
    We rewrote the sitemap creation process, and the sitemap now contains 43,000 URLs, 20k more than before. Our URLs change daily. The script that creates the complete sitemap takes more than 30 hours, so we cannot build it every day. Let's say that increasing the speed of the script is not possible. What should I do?

        A: Stay with the 23k URLs and update the sitemap daily
        B: Increase the number of URLs to 43k and update it weekly

    Read the article

  • PHP form: generate a PDF and send it to my email

    - by tom
    I'm a beginner in PHP, and I was wondering if this is easy to do or if I'd have to outsource it to a programmer. Basically, when a user fills in the PHP form and submits it, I need the submission to be generated as a PDF which is then emailed as an attachment to MY email address, NOT to the user who submitted the form. I have looked at TCPDF and FPDI, but I don't think those scripts allow me to do this specifically; from what I've heard, they generate a download link for the user, and that is not what I need. If anyone can help me it would be greatly appreciated. Regards, Tom
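
    For what it's worth, TCPDF can return the generated PDF as a string instead of forcing a download, and that string can be attached to a server-side email. A minimal sketch, assuming TCPDF and PHPMailer installed via Composer (PHPMailer is an addition, not something from the question; field names and addresses are illustrative):

        <?php
        require 'vendor/autoload.php';

        use PHPMailer\PHPMailer\PHPMailer;

        // build the PDF in memory from the submitted form fields
        $pdf = new TCPDF();
        $pdf->AddPage();
        $pdf->Write(0, "Name: " . $_POST['name'] . "\nMessage: " . $_POST['message']);
        $pdfData = $pdf->Output('submission.pdf', 'S'); // 'S' = return as string, no download

        // mail it to the site owner only -- the visitor never sees the PDF
        $mail = new PHPMailer(true);
        $mail->setFrom('forms@example.com');
        $mail->addAddress('me@example.com'); // MY address, not the submitter's
        $mail->Subject = 'New form submission';
        $mail->Body    = 'The submitted form is attached as a PDF.';
        $mail->addStringAttachment($pdfData, 'submission.pdf');
        $mail->send();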

    Read the article

  • Temporary website redirect: 3xx or php/meta?

    - by Damien Pirsy
    Hi, I run a (small) news website which also has a forum in a subfolder of the root. I'm planning to give the site a facelift and restructure the code, so I want to put a redirect on the home page that points directly to the forum's index (www.mysite.com to www.mysite.com/forum) while I tinker with it. And that, given the little free time I have, will take no less than a couple of months. Being a news site, I'm pretty sure this will affect its overall ranking, but I need to do it. So: which is the best way to redirect? I've pondered and read here and there about the different methods, but I couldn't figure out which is worst for SEO. Do I use a 302 redirect, or a "Location: newurl" header sent from PHP? Or do I just put a meta tag in the HTML page (or JavaScript; which is better)? Sorry, I'm not really into these things, so I may have said something silly. Thanks
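
    As a side note on mechanics, a PHP "Location" header is precisely how a 3xx response gets produced, so those two options are really one. A minimal sketch of a temporary redirect in PHP (URL as in the question):

        <?php
        // 302 = "Found" (temporary); a 301 would tell search engines the move is permanent
        header('Location: http://www.mysite.com/forum/', true, 302);
        exit;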

    Read the article

  • Implications of automatically allowing "open" third-party domain aliasing to one of my subdomains

    - by Giovanni
    I have a domain, let's call it www.mydomain.com, where I have a portal with an active community of users. In this portal users cooperate, wiki-style, to build some "kind of software". These software applications can then be run by accessing "public.mydomain.com/softwarename". I now want to let my users run these applications from their own subdomains. I know I can do that by automatically modifying the .htaccess file; this is not a problem. I want to let these users create DNS aliases pointing to one specific subdomain. So if a user "pippo" who owns "www.pippo.com" wants to run the software HelloWorld from his own subdomain, he has to:

        1. Register on my site
        2. Create a subdomain on his own site, run.pippo.com
        3. From his DNS control panel, create a CNAME record "run.pippo.com" pointing to "public.mydomain.com"
        4. Type http://run.pippo.com/HelloWorld in a browser

    When the software (which physically runs on my server) is called, it first checks that the originating domain is a trusted one; I don't do any other kind of check restricting software execution. From an SEO perspective, I care about Google indexing www.mydomain.com but not about indexing public.mydomain.com. What are the possible security implications of doing this for my site? Is there a better way to do it, or existing software that already does this that I could use?
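
    For illustration, the "trusted originating domain" check usually keys off the Host header that the CNAME alias delivers. A minimal PHP sketch, assuming the registered aliases are kept in a whitelist loaded from the user registry (names illustrative):

        <?php
        // run.pippo.com CNAMEs to public.mydomain.com, so the browser still sends
        // "Host: run.pippo.com" -- that is what identifies the registered user
        $trusted = array('public.mydomain.com', 'run.pippo.com'); // from the registry
        $host = strtolower($_SERVER['HTTP_HOST']);
        if (!in_array($host, $trusted, true)) {
            header('HTTP/1.0 403 Forbidden');
            exit('Untrusted domain');
        }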

    Read the article

  • Is there a search engine that indexes the source code of a web page?

    - by Dexter
    I need to search the web for sites in our industry that use the same AdWords management company, to ensure that said company is not violating our contract, as it has been accused of doing. They use a tracking code in the template of every page, with a certain domain in the URL, and I'm wondering if it's possible to "Google" the source code using some bot that crawls the code rather than the content. For example, I bought an unlimited license for an image gallery, and I was asked to type the license number in a comment just before the script. I thought it was just so a human could look at the source and find out if someone had paid, but it turned out that they actually had a crawler looking for their source code and that comment. If it ran across the code on your site, it would look for the comment, and if it found one, it would check whether it was an existing license. If not, it would first notify you of your noncompliance, and then notify the owner of the script. Edit: I'm looking to index HTML and JavaScript only, not the server-side languages or Java.

    Read the article

  • Set an IP address to point to a certain domain

    - by silvercover
    I have a Linux VPS with DirectAdmin as the web panel, and I have already pointed a domain at it. Everything is OK and I can see my website in a browser using the domain name. Now I need access to my site using its IP address, something like http://86.57.88.29, but when I try to load the site by IP I get the message below, and I have to suffix the IP with /~admin (http://86.57.88.29/~admin) to make it work:

        This IP is being shared among many domains. To view the domain you are looking for, simply enter the domain name in the location bar of your web browser.

    So how can I configure the IP to point to my public_html folder, without any ~admin-like suffix? Thanks.
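
    For illustration only: on a plain Apache setup, making the bare IP serve a specific docroot amounts to giving the IP its own VirtualHost with the IP as the ServerName. The DocumentRoot below follows the usual DirectAdmin layout but is an assumption, and DirectAdmin regenerates its own httpd configuration, so a change like this would normally go through its custom template hooks rather than a hand edit:

        <VirtualHost 86.57.88.29:80>
            ServerName 86.57.88.29
            DocumentRoot /home/admin/domains/mydomain.com/public_html
        </VirtualHost>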

    Read the article

  • Please explain some of the features of the URL Rewrite module for a newbie

    - by kunjaan
    I am learning to use the IIS URL Rewrite module, and some of the "features" listed on its page are confusing me. It would be great if somebody could explain them to me and give a first-hand account of when you would use each one. Thanks a lot!

        • Rewriting within the content of specific HTML tags
        • Access to server variables and HTTP headers
        • Rewriting of server variables and HTTP request headers (what are the "server variables", and when would you define or redefine them?)
        • Rewriting of HTTP response headers
        • HtmlEncode function (why would you use HtmlEncode on the server?)
        • Reverse proxy rule template
        • Support for IIS kernel-mode and user-mode output caching
        • Failed Request Tracing support
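
    To illustrate the server-variable items: server variables are the name/value pairs IIS exposes to the request pipeline (HTTP_HOST, REMOTE_ADDR, and each request header as HTTP_*), and a rule can set or overwrite them, typically to pass information to the application behind a reverse proxy. A minimal web.config fragment as a sketch, assuming the URL Rewrite module is installed and the variable has been added to the module's allowed-server-variables list (the variable name is illustrative):

        <rewrite>
          <rules>
            <rule name="TagOriginalHost">
              <match url=".*" />
              <serverVariables>
                <!-- preserve the Host header the client actually sent -->
                <set name="HTTP_X_ORIGINAL_HOST" value="{HTTP_HOST}" />
              </serverVariables>
              <action type="Rewrite" url="{R:0}" />
            </rule>
          </rules>
        </rewrite>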

    Read the article

  • Using env variables with RewriteRule and ErrorDocument

    - by misterte
    Hi, I'm having problems with the following while configuring my Apache server to rewrite some URLs:

        SetEnv PATH_TO_DIR /directory
        RewriteRule ^%{PATH_TO_DIR}/([a-zA-Z0-9_\-]+)/([a-zA-Z0-9_\-\.]+)/?$ /index.php?dir=$1&file=$2
        ErrorDocument 404 %{PATH_TO_DIR}/index.php?dir=null&file=error

    This configuration used to work perfectly fine until I introduced SetEnv PATH_TO_DIR. I need to use it because there are lots of rules, not just these. Can anyone point out my mistake? Apache returns %{PATH_TO_DIR}/index.php?dir=null&file=error when I try anything (www.site.com/foo/bar for instance), and returns the ErrorDocument if I just try to fetch the index. I know it's not a problem with the rewrite rules themselves, because they work when I remove the PATH_TO_DIR variable and just hard-code the path. Thanks! A.
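
    For context, a sketch of why this fails and of one possible workaround (an illustration of the mechanism, not a tested drop-in): mod_env's SetEnv runs after mod_rewrite, so its variables are invisible there; SetEnvIf (mod_setenvif) runs early enough, but even then the variable only expands via %{ENV:...} in a RewriteCond test string or a rule's substitution, never inside a RewriteRule pattern. ErrorDocument performs no variable expansion at all, so that path has to stay hard-coded.

        # set the variable early enough for mod_rewrite to see it
        SetEnvIf Request_URI ^ PATH_TO_DIR=/directory
        # match the prefix via a backreference instead of putting it in the pattern
        RewriteCond %{ENV:PATH_TO_DIR}#%{REQUEST_URI} ^([^#]+)#\1/([a-zA-Z0-9_\-]+)/([a-zA-Z0-9_\-\.]+)/?$
        RewriteRule ^ /index.php?dir=%2&file=%3 [L]
        # ErrorDocument is expanded by the Apache core, not mod_rewrite: no variables
        ErrorDocument 404 /directory/index.php?dir=null&file=error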

    Read the article

  • Multiple sites redirected to one main site

    - by mattgcon
    I have a client who insists on having multiple domains all redirecting to one main website domain. It is getting out of hand: his server has become convoluted and riddled with garbage because of it, not to mention confusing at times. Each of the domains he sets up has no content; it simply redirects the user to the main domain. Is this practice of having multiple domains pointing to one main website common? And does anyone know where I can get information to give to this client to let him know it is a bad practice, if indeed it is?

    Read the article

  • Shifting from no-www to www and browsers' password storage

    - by user1444680
    I created a website with a user-registration system and invited my friend to join it. I gave him the link to the no-www version, http://mydomain.com, but now, after reading this and this, I want to shift to www.mydomain.com. There's a problem, though: I saw that my browser stores separate passwords for mydomain.com and www.mydomain.com. So in my friend's browser his password must have been stored for the no-www form. That means that after I shift to www, the next time he opens the login page his browser won't auto-fill the username and password fields, and there will also be an extra (no-www) entry in his browser's database of stored passwords. Can this be avoided? Can I do something to convey to browsers that www.mydomain.com and mydomain.com are the same website? I already have a CNAME record for www pointing to mydomain.com, but it seems that while search engines treat a CNAME as an alias, browsers treat the two hosts as different websites; I don't know why.
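
    For reference, the standard way to tell browsers and search engines that the two hosts are one site is a permanent redirect to the canonical form. A minimal .htaccess sketch, assuming Apache with mod_rewrite:

        RewriteEngine On
        # send every no-www request to the www host with a 301 (permanent)
        RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]

    Browsers key stored passwords to the exact origin, though, so an existing no-www entry still won't auto-fill on the www login form; the redirect only keeps the split from growing.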

    Read the article

  • Canonicalization issue regarding academic URL vs. blog URL

    - by user5395
    I'm sorry if what I am about to write is long-winded; I only wish to be clear. I am an academic in the scientific community, and I maintain a web site for my research, teaching, and other professional activities. Until recently, the content for this site was hosted in a directory on my university department's own server, at an address of the typical form (universityname).edu/~(myusername).

    I decided that I wanted to use WordPress to host and manage my page, so I set up a WordPress.com blog and then replaced the index.html file in (universityname).edu/~(myusername) with a new one consisting of a single frame containing the WordPress.com blog. Now when a user visits (universityname).edu/~(myusername), he or she sees the blog instead. This has been pretty nice because, even when the user clicks on links between pages or posts in the blog, the only thing showing up in the address bar of the browser is www.(universityname).edu/~(myusername), because the blog is constrained to a frame.

    However, the effect of this change on the search side of things has not been so kind to me. Before, when someone searched for my name in Google, the first result was always (universityname).edu/~(myusername). This is the most desirable outcome, for professional reasons. (Having my academic URL come up first suggests that I am an accredited professional, and not just some crank with a blog!) But now Google seems to have canonicalized my web presence under the blog's WordPress.com address: it has completely forgotten about my academic URL and considers the WordPress.com address to be the best address representing me on the web. Unfortunately, WordPress.com doesn't support the canonical tag, so I can't tell the blog to advertise itself as my academic URL in the header. (It doesn't seem to help at all that I have used the WordPress.com dashboard to turn on no-indexing of the blog.)

    One obvious solution would be to use the departmental server to host my content again, with a local installation of the WordPress platform; for reasons beyond my control, the platform will not be deployed on the departmental server at this time. Another solution would be shared hosting with WordPress.org support, because the WordPress.org platform does support the canonical tag (albeit via a plug-in). But that seems to usually require purchasing a domain name and other fees, and there is no guarantee that Google will listen to the canonical tag (it might use whatever domain name I end up with instead).

    Is there a way I can more cleverly integrate the WordPress.com blog into a page hosted on my department's server? Is there some PHP code I can write to retrieve the blog's contents in a way that Google won't treat as a link / "perceive" as the blog? Please note: I am a PHP novice at best. I just feel there should be a simpler solution to all this, within the constraints of what I have described above. Thanks!
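
    On the last question, a server-side include is possible without frames. A minimal PHP sketch, assuming allow_url_fopen is enabled on the departmental server (the blog URL is illustrative); because the fetch happens on the server, crawlers only ever see the university address:

        <?php
        // pull the blog's rendered HTML and serve it under this page's own URL
        $blog = file_get_contents('http://myblog.wordpress.com/');
        if ($blog !== false) {
            echo $blog;
        }

    In practice the fetched HTML would still need its internal links rewritten to point back at the university address, otherwise the first click would hand visitors (and crawlers) to wordpress.com again.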

    Read the article

  • Making my own clothes website [on hold]

    - by Manjushree
    I am a BSc student in Mathematics, but I would like to create my own clothes website. Can anyone help me with how to design it? I don't have any background knowledge of putting a web page online. The website does not have to look professional, just simple enough for me to show my clothes to people or customers. Once I have created it, I can open a business account and start selling the goods online. Do I need to buy a domain to create the website? Please help me.

    Read the article

  • Default WordPress site on IIS

    - by Mike
    We have multiple WordPress installations on our IIS7 (Windows Server 2008) server, as follows:

        http://www.example.com/site_one
        http://www.example.com/site_two
        http://www.example.com/site_three

    These all work properly. However, we would like to configure it so that when users visit the root domain (http://www.example.com/) or any page underneath, i.e.:

        http://www.example.com/
        http://www.example.com/page1
        http://www.example.com/page2

    they would actually see the corresponding pages for site_two:

        http://www.example.com/site_two/
        http://www.example.com/site_two/page1
        http://www.example.com/site_two/page2

    How could we achieve this?
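
    For illustration, with the IIS URL Rewrite module this is commonly done with a root rule that rewrites everything except the site folders themselves. A minimal web.config sketch for the root site, assuming the module is installed (folder names as above, everything else illustrative):

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="RootToSiteTwo" stopProcessing="true">
                  <!-- leave direct requests to the three installs untouched -->
                  <match url="^(?!site_one|site_two|site_three)(.*)$" />
                  <action type="Rewrite" url="/site_two/{R:1}" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>

    WordPress would also need its site/home URLs to match what visitors see, so treat this as a starting point rather than a complete recipe.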

    Read the article

  • Multiple domains with one host - SEO point of view

    - by Swing Magic
    Let's say I currently have the domain mycompanyname.com, and after quite some time I have found it very hard to hit the top of the SERPs. While searching, I found a domain that matches one of my keywords. I have built the website in two languages, and I am able to assign each URL a different language. My question: what is the impact on SEO? There would be a load of duplicated content between those domains (images, video, etc.), and I am afraid one of my websites will be marked as a plagiarist because much of the content will be the same. Has anyone experienced the same situation? Thank you!
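
    For illustration, the usual guard when the same content must live at two addresses is to declare one canonical URL per page in the head of both copies (domain and path illustrative):

        <link rel="canonical" href="http://mycompanyname.com/some-page" />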

    Read the article

  • What are some client-side sitemap generators?

    - by BHare
    Most of the sitemap generators I've found scan my internal files and base the map on that. However, using Apache and .htaccess I have many aliases for things; for example, /brand/model/photos/ is empty directory-wise, but on the web it maps to a Zenphoto gallery via .htaccess. http://www.xml-sitemaps.com/ does what I need, but I'd like to have more control over it, and I need it to be dynamic and run weekly or daily, allowing specific change frequencies based on the file/directory. I also want to be able to produce a "pretty" sitemap HTML file. Counting the photo gallery I have around 300 links; without the photo gallery, more like 100. I have MySQL and PHP 5.3 installed.
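
    As a sketch of how little code a custom generator needs, assuming a hand-maintained (or database-driven) URL list with per-URL change frequencies (PHP 5.3-compatible; URLs illustrative):

        <?php
        // each entry: location plus how often it changes
        $urls = array(
            array('loc' => 'http://www.example.com/', 'changefreq' => 'daily'),
            array('loc' => 'http://www.example.com/brand/model/photos/', 'changefreq' => 'weekly'),
        );
        $xml = new SimpleXMLElement(
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"/>'
        );
        foreach ($urls as $u) {
            $node = $xml->addChild('url');
            $node->addChild('loc', htmlspecialchars($u['loc']));
            $node->addChild('changefreq', $u['changefreq']);
        }
        $xml->asXML('sitemap.xml'); // run from cron daily or weekly

    The same loop could emit the "pretty" HTML sitemap from the identical $urls array.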

    Read the article

  • How can I decrease relevancy of Creative Commons footer text? (In Google Webmaster Tools)

    - by anonymous coward
    I know that I may just have to link the image alone to make this happen, but I figured it was worth asking, just in case there's some other semantic markup or tips I could use... I have a site that uses the textual Creative Commons blurb in the footer. The markup is like so:

        <div class="footer">
            <!-- snip -->
            <!-- Creative Commons License -->
            <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/"><img alt="Creative Commons License" style="border-width:0" src="http://i.creativecommons.org/l/by-nc-sa/3.0/us/80x15.png" /></a><br />
            This work by <a xmlns:cc="http://creativecommons.org/ns#" href="http://www.xmemphisx.com/" property="cc:attributionName" rel="cc:attributionURL">xMEMPHISx.com</a> is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/">Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License</a>.
            <!-- /Creative Commons License -->
        </div>

    Within Google Webmaster Tools, the list of relevant keywords is heavily saturated with text from that blurb. For instance, 50% of my top ten most relevant keywords (including the site name) come from it:

        [site name], license, [keyword], commons, creative, [keyword], alike, [keyword], attribution, [keyword]

    I have not done any extensive testing to find out whether or not this list even matters, and so far it doesn't impact performance in any way. The site is well designed for humans, and it is as findable as it needs to be at the moment. But, mostly out of curiosity: do you have any tips for decreasing the relevancy of the text from the Creative Commons footer blurb?

    Read the article

  • Creating country specific twitter/facebook accounts

    - by user359650
    I see many companies that have an international presence trying to localize their social media presence by creating country- or language-specific accounts. However, some seem to have done so without following a consistent pattern; one example is the World Wildlife Fund, looking at their Twitter accounts:

        • World_Wildlife : verified account with 200K followers
        • WWF : main account with 800K followers
        • wwf_uk : lower case, with an underscore between WWF and the country indicator
        • WWFCanada : upper case, with the country indicator attached to WWF
        • ...

    I am planning to build a website which will hopefully grow global, and I would like to avoid this sort of inconsistency. Also, comparing what Twitter and Facebook allow in their usernames, I found that they don't allow the same characters (for instance, the former doesn't allow "." whereas the latter does), making it difficult to ensure consistency across social networks. Hence my questions:

        • Are there known naming schemes for creating localized Twitter and Facebook accounts while maintaining a certain consistency between them (best effort)?
        • Is there any research out there showing whether some schemes are better than others in terms of readability and/or SEO?

    Read the article

  • Cloud hosting vs dedicated hosting: advantages and disadvantages

    - by bcmcfc
    I'm currently looking for a hosting company that can provide a very solid service with a 100% SLA. In my search, both cloud hosting and managed dedicated hosting have come up. (I'd rather not manage the server myself, as I'm still rather new to Linux.) I'm not sure if phrasing this as a "which is best" question makes sense, but what advantages does cloud hosting have over dedicated server hosting? I need a reliable service above all else, and some elements of the application to be hosted will be relatively CPU-intensive, although those spikes in CPU usage will be sporadic, so the hosting needs to be able to deal with that.

    Read the article

  • Google AdSense threshold estimation

    - by Wladimir Ivanov
    I have an electronic music blog with traffic mainly from North America, Western Europe and Russia. Daily I get about 100 unique visitors with 150-200 pageviews. Should I start AdSense, or do I need to work on increasing the traffic first? Can you suggest another monetization option appropriate for this case? How long would it take me to hit the $100 AdSense payout threshold with these traffic statistics? Thanks in advance.
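
    As a rough worked estimate only, with an assumed (not quoted) revenue figure: small content sites often earn on the order of $0.50-$2 per 1,000 pageviews. At 200 pageviews a day that is about $0.10-$0.40 per day, i.e. roughly $3-$12 per month, which would put the $100 payout threshold somewhere between about 8 months and 3 years at current traffic. Actual results vary widely with niche, ad placement and visitor geography.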

    Read the article
