Search Results

Search found 9757 results on 391 pages for 'shekhar pro'.


  • Blogging platform for shared hosting?

    - by Ankit
    First of all, this is not meant to start a flame war, so please don't bash any particular blogging platform :P I have a shared hosting account and I want to set up a blog on it. The thing is, since it's shared hosting, I don't have as much bandwidth as some expensive hosting plan would give me. I have tried WordPress with caching plugins configured, but it still doesn't seem much faster. If I create a simple PHP website, though, the speed is fine. Can someone suggest a lightweight platform?


  • Does hiding images on 404 error affect SEO?

    - by Question Overflow
    I have a dynamic website that allows registered users to upload and display images on their profile page. As each user may upload fewer than the maximum of 20 images, there can be some "empty" image slots on the page, which I hide using JavaScript. Loading a profile page therefore generates a series of 404 errors, one for each empty image. Would these 404 errors affect the SEO of the page and of the website?


  • How should I update my name server after I installed a new dedicated server?

    - by Jim Thio
    Say I got a dedi. The IP is 123.123.123.123, and domainname.com will be the "main" domain name for that server. Should I (a) set the name servers of domainname.com to ns1.domainname.com and ns2.domainname.com, and add child name servers (glue records) ns1.domainname.com and ns2.domainname.com pointing to that exact IP? Or should I (b) point the domain at my registrar's name servers and set an A record there pointing to my IP? Which one is right? Obviously I want ns1.domainname.com and ns2.domainname.com to point to my IP so I can then point hundreds of domains at it, but how exactly should I do that? For what it's worth, I simply use cPanel (CentOS with cPanel).
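
    For illustration only, here is what option (a) looks like in zone-file terms, using the asker's example names and IP; the registrar's "child nameserver" entries are the glue records that make these names resolvable in the first place:

        domainname.com.      IN NS  ns1.domainname.com.
        domainname.com.      IN NS  ns2.domainname.com.
        ns1.domainname.com.  IN A   123.123.123.123
        ns2.domainname.com.  IN A   123.123.123.123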


  • How do you find all the links to disavow for a Google reconsideration request? [duplicate]

    - by QF_Developer
    This question already has an answer here: "How to identify spammy domains giving backlinks to my site (to submit in disavow links in WMT)" (2 answers). A few months ago I received the following notification in Google Webmaster Tools for a website I look after: "Unnatural links to your site—impacts links. Google has detected a pattern of unnatural, artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole. Learn more." The question is: should we actively attempt to disavow these links, given that the action seems targeted at just a handful of keywords? I've downloaded the inbound links sample from Google Webmaster Tools, and so far I've been through the disavow and reconsideration process 6 times, each round taking 2-3 weeks only for Google to supply just 2 more links they don't approve of. At this rate it will take the rest of my natural life to clean up all these spammy links! Disavowing seems futile, as Google haven't taken broad action against the website as a whole and (from what I can gather) have already nullified the value of the offending links. Under the quoted statement, however, is a reconsideration request button that seems to imply I should be actively doing something here? UPDATE 14th October: I have since created a small .NET application that you feed the CSV sample links file from Google Webmaster Tools. The tool crawls all the links and looks for specific linking patterns, as per some configurable match strings. I realised that many of the links Google is taking issue with were created by a rogue SEO firm we hired several years ago; all of them are appended with 1 of 5 different descriptions. The application uses some regexes to isolate any link sources with these matching appendages and automatically builds the disavow .txt file. In the end it had to come down to an algorithm, as manually disavowing links on this scale would take weeks! I will post the app here once I've cleaned it up.
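
    The asker's tool is a .NET application and hasn't been posted; purely as an illustration of the regex-to-disavow idea, here is a minimal PHP sketch. The file names and patterns are hypothetical, and it matches against the link URLs themselves rather than crawling each page as the real tool does:

        <?php
        // Hypothetical match strings standing in for the rogue SEO firm's link descriptions
        $patterns = array('/widget-discounts/i', '/cheap-widget-deals/i');
        $disavow  = array();
        // links.csv: one inbound link URL per line, from the Webmaster Tools sample export
        foreach (file('links.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $url) {
            foreach ($patterns as $pattern) {
                if (preg_match($pattern, $url)) {
                    // disavow the whole source domain, not just the single URL
                    $disavow['domain:' . parse_url($url, PHP_URL_HOST)] = true;
                    break;
                }
            }
        }
        file_put_contents('disavow.txt', implode("\n", array_keys($disavow)) . "\n");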


  • IIS MIME type for XML content

    - by Rodolfo
    Recently a third-party plugin I'm using to display online magazines stopped working on mobile devices. According to the vendor's help page, this happens for people serving with IIS, and their solution is to set the MIME type for .xml to "application/xml"; IIS sets it to "text/xml" by default. Changing it does work, but would that have unintended side effects, or is "application/xml" actually the correct type and IIS just sets it wrong?
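
    For reference, the change can also be scoped to a single site via web.config rather than made server-wide; a sketch using the IIS 7+ schema (not the asker's exact setup):

        <configuration>
          <system.webServer>
            <staticContent>
              <remove fileExtension=".xml" />
              <mimeMap fileExtension=".xml" mimeType="application/xml" />
            </staticContent>
          </system.webServer>
        </configuration>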


  • Thousands of 404 errors in Google Webmaster Tools

    - by atticae
    Because of an old error in our ASP.NET application, created by my predecessor and undiscovered for a long time, thousands of wrong URLs were created dynamically. Normal users did not notice it, but Google followed these links and crawled itself through the incorrect URLs, creating more and more wrong links. To make it clearer: the URL example.com/folder should have created the link example.com/folder/subfolder, but was creating example.com/subfolder instead. Because of bad URL rewriting this was accepted, and by default the index page was shown for any unknown URL, producing more and more links like example.com/subfolder/subfolder/... The problem is resolved now, but I still have thousands of 404 errors listed in Google Webmaster Tools, discovered 1 or 2 years ago, and more keep coming up. Unfortunately the links do not follow a common pattern that I could disallow for crawling in robots.txt. Is there anything I can do to stop Google from trying these very old links, and to remove the already-listed 404s from Webmaster Tools?
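
    To see why the bad rewriting compounded: a relative href resolves against the current URL's directory, so once one malformed URL was accepted, every relative link on the page it served minted another level. A sketch:

        <!-- on example.com/folder  (treated as a file), href "subfolder" resolves to example.com/subfolder -->
        <!-- on example.com/folder/ (treated as a directory), it resolves to example.com/folder/subfolder  -->
        <a href="subfolder">subfolder</a>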


  • Issue with Godaddy DNS manager

    - by Fischer
    I'm using domains.live.com to set up email for a domain registered with GoDaddy. The domains.live.com configuration page supplies this value, but GoDaddy's DNS manager isn't accepting the string: v=spf1 include:hotmail.com ~all It gives an error; something is wrong either with the string or with the DNS manager, and I would like to know how to fix it. Notes: the "more information" link is dead, GoDaddy no longer gives support by email, and there is no Microsoft support.
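
    For reference, the value is a standard SPF string, and SPF data is published as a TXT record on the bare domain; in zone-file terms the intended result is something like:

        @   IN TXT   "v=spf1 include:hotmail.com ~all"

    In a registrar's DNS manager that usually means host "@" with the string as the value; stray leading or trailing whitespace from copy-paste is a frequent culprit when such strings are rejected.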


  • Pins on Pinterest link to 404 - yet link is valid

    - by Damian Rees
    This is a really strange one. I've set up a bunch of pins on Pinterest linking through to our services, and those all work fine. Then I did the same for our blog articles (we use WordPress), yet every time I click the link (and I've tried this on different computers) it goes to a 404 page on our site. The link is valid, however, and if you right-click the pin and open it in a new window it opens fine. I have contacted Pinterest, who are next to useless. I have also tried different browsers, different computers and different Pinterest accounts. I can't see any weirdness in my .htaccess files that would cause this, so I'm a little stumped. Any suggestions?


  • Ranking hit after site migration

    - by Ben
    I migrated my site from its old domain over a month ago. I followed the Google Webmaster Tools guidance completely, including 301 redirects from every existing URL to the new domain, followed by a change-of-address submission. Traffic continued as normal, but a few days after submitting the change of address it plummeted to about 20-30% of what it was previously. Most of my traffic comes from organic search, and I can see that I am now ranking much, much lower for the keywords I had targeted and performed well with before. For some low-competition keywords I've only lost a few places; for higher-competition terms I have really suffered. Things have started to pick up a bit (for one of my keywords I have risen from 195 to 100 in the last week), but it seems to be a very slow process. How seamless is this process normally? I was under the impression the move would not affect my rankings too severely, but it has now been a month and recovery seems very slow, if it is happening at all. Is it likely that I've missed something? The only structural change is that what was the home page is now more of a sub-page, with a magazine-style home page in its place. I understand that links to the old site now point to the latter, so rankings for keywords attributed to the old home page will take a hit, but I have seen a drop even on pages that map exactly onto the structure of the previous site.
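
    For comparison, a typical Apache catch-all 301 from an old domain looks like the following sketch (hypothetical names; the asker may well have an equivalent already in place):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
        RewriteRule ^(.*)$ http://www.new-domain.example/$1 [R=301,L]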


  • Could Ajax + Caching be seen as cloaking?

    - by Angel
    I have a website where we use a technique to speed up loading times based on a combination of AJAX and caching. Basically, when a section of a page has content that is slow to retrieve, we first check whether it's cached. If it is, we serve the content; if it's not, we serve a placeholder and then make an AJAX call from the client to retrieve the content, which is then cached for subsequent requests. As a consequence, sometimes you get the entire page content in the first request, and sometimes you get placeholders, which are filled immediately by the responses to the AJAX requests. You can see an example in the results count by category in the right column of this page: http://www.inzoco.com/crits/2-1-3-28-185-0-28079-0-0/listado-piso-en-alquiler-en-madrid-madrid.aspx I'm worried this could be seen as cloaking by search engines, because if you request a page whose content isn't cached and then ask for the same page again, you get different responses: the first with the placeholders and AJAX requests, the second with all the content rendered.
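
    The site in question is ASP.NET, but the pattern described is only a few lines in any server language; a PHP sketch with hypothetical names (APC shown, though any cache backend works):

        <?php
        // Render a slow page section: inline if cached, placeholder otherwise.
        function render_section($key)
        {
            $html = apc_fetch($key);
            if ($html !== false) {
                echo $html;  // same markup the AJAX endpoint returns, so responses converge
            } else {
                // client-side JS sees the placeholder and requests the section via AJAX;
                // that endpoint computes the HTML, caches it under $key, and returns it
                echo '<div class="placeholder" data-section="'
                    . htmlspecialchars($key) . '"></div>';
            }
        }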


  • DNS hosting - URL forwarding - hiding the forwarded URL?

    - by jeremycollins
    I have free DNS hosting with the domain registrar, and I'd like the DNS-hosted domain www.example.com to display the contents of www.myotherlongdomain.com. I only have 301/302/iframe forwarding options, but I want to mask the redirected (long-domain) URL. If I use frames, users can view the source and see the (long-domain) URL the contents come from. How can I hide it so the browser always displays www.example.com? There is no cloaking/masking option with the registrar. Thanks.
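
    For reference, the registrar's iframe option typically amounts to serving a tiny page like the following from www.example.com, which is exactly why "view source" exposes the long domain (a sketch):

        <html>
          <head><title>www.example.com</title></head>
          <frameset rows="100%,*" frameborder="no" border="0">
            <frame src="http://www.myotherlongdomain.com/" />
          </frameset>
        </html>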


  • Is traditional JavaScript image pre-loading taboo?

    - by Evan Plaice
    I remember the good old days (not really), back when I was still sucking the teat of Dreamweaver to build websites and the lure of playing copy-pasta with fancy built-in scripts (e.g., image swap) was like black magic. I'm pretty far removed from that nowadays, but I was adapting a small site from its original FrontPage (::cringe::) format to a standard HTML/CSS implementation and couldn't help wondering: should I re-implement the JavaScript image pre-loading in the current version, or is there a better way? I don't want to block the page from loading by requiring the user to fetch all the page's assets up front, the way the traditional JavaScript pre-loader does. I value giving the user something to look at ASAP, and there's some potential harm to my Google mojo in doing otherwise. Is there a cleaner solution to prevent unnecessary page reflows during loading? Such as setting static width/height dimensions through a CSS style attribute on the image element?
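
    The closing idea in the question is sufficient by itself to prevent reflow: declare each image's box up front so the browser reserves the space before the file arrives. A sketch:

        <!-- dimensions are hypothetical; with them fixed, the page doesn't reflow while photo.jpg loads -->
        <img src="photo.jpg" width="640" height="480" alt="Product photo" />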


  • 301 redirect bulk aspx URLs on IIS

    - by tiki16
    We recently relaunched an old ASPX site as a new Drupal site on the same domain. No 301 redirects were implemented at the time. I have now exported a list of 1000 URLs that need to be 301-redirected. Most of them are the results of search queries that were run on the old website, e.g.: http://www.mysite.com/electronics/CommunityDetails.aspx?FirstLetter=%&ID=444 We are running the Drupal site on IIS using a PHP plugin. Is there a way I can wildcard a redirect of all ASPX pages? I know how I would do it with .htaccess, but that doesn't apply here. Any suggestions appreciated.
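
    On IIS the .htaccess role is played by web.config plus the URL Rewrite module; assuming that module is installed, a single wildcard rule can catch every .aspx request. A sketch that blanket-redirects them to the home page (a rewriteMap could instead map the 1000 URLs individually):

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="Legacy ASPX pages" stopProcessing="true">
                  <match url="\.aspx$" />
                  <action type="Redirect" url="/" redirectType="Permanent" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>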


  • Do keyword-based filenames and URLs really matter?

    - by Justin Scott
    We've developed a dynamic web application that uses URLs such as product.cfm?id=42, but our marketing team says we should use search-friendly URLs with our keywords in them (so it would be product-name.cfm instead). Our developers tell us this will cost more money and take additional time. Is it worth the effort? How important is this to the search engines, and will it impact our rankings?
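
    Worth noting: the two styles aren't mutually exclusive. A rewrite rule can expose a keyword URL while the same product.cfm keeps doing the work; an Apache mod_rewrite sketch (IIS has equivalents, and the URL scheme here is hypothetical):

        # /product/42/blue-widget  ->  /product.cfm?id=42  (the slug is purely decorative)
        RewriteRule ^product/([0-9]+)/[a-z0-9-]+$ /product.cfm?id=$1 [L,QSA]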


  • Problem with JavaScript [migrated]

    - by Andres Orozco
    Well, I have a problem, and I made this test code to show it. HTML code: http://i.stack.imgur.com/7qlZx.png JavaScript code: http://i.stack.imgur.com/DYvuq.png What I want to do in this example is change the styles of the element with the given id, but for some reason it doesn't work. The bad news is that I have trouble spotting problems in my own code, so I don't know what's happening and I really need your help. Thanks, Andres.
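
    Since the actual code exists only as screenshots, only a generic sketch is possible here. The two classic failure modes when changing styles by id are passing "#myId" to getElementById and running the script before the element exists:

        // run after the DOM is parsed, e.g. from a script tag at the end of <body>
        var el = document.getElementById('myId');   // note: no leading '#'
        if (el) {
            el.style.color = 'red';
            el.style.fontWeight = 'bold';
        }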


  • Trying to Host Server for External Access - Apache, VirtualBox & Port Forwarding

    - by Tspoon
    Banging my head against the wall at this stage... trying to host my Apache site on Ubuntu 12.10 in VirtualBox, with Windows 8 as the host. Things I've done so far: ensured Apache is listening on ports 80, 443 and 8080 (for thoroughness):

        tcp  0  0  0.0.0.0:443   0.0.0.0:*  LISTEN  3355/httpd
        tcp  0  0  0.0.0.0:8080  0.0.0.0:*  LISTEN  3355/httpd
        tcp  0  0  0.0.0.0:80    0.0.0.0:*  LISTEN  3355/httpd
        tcp  0  0  0.0.0.0:22    0.0.0.0:*  LISTEN  681/sshd

    The VM uses a bridged network connection, and I assigned a static IP to the Ubuntu VM, which can be accessed fine from within the network. I forwarded TCP ports 80, 8080 and 443 to the VM's static IP on my router, gave the VM a static NAT address, turned off the Ubuntu firewall and the router firewall, and read on forums that my ISP (Eircom) allows port 80 to be used. And I still can't access the site using the WAN/external IP; I checked internally and with CanYouSeeMe.org, which says all the ports I mentioned are closed. I'm really at a loss for what to try next... am I missing something silly here? Note: I haven't assigned a static IP address within the router, only within the VM, and the DHCP server is enabled. Is that bad?


  • Favorite Visual Studio 2010 Extensions, Update

    - by Scott Dorman
    With the release of the Visual Studio Pro Power Tools (and many other new extensions having been released), my list of favorite Visual Studio extensions has changed. All of these extensions are available in the Visual Studio Gallery. Here is the list of extensions that I currently have installed and find useful: Bing Start Page, CodeCompare, Collapse Selection In Solution Explorer, Collapse Solution, Color Picker Completion, Extension Analyzer, Find Results Highlighter, Find Results Tweak (available from CodePlex), Format Document, HelpViewerKeywordIndex, HighlightMultiWord, Image Insertion, Indentation Matcher Extension, ItalicComments, MoveToRegionVSX, Numbered Bookmarks, PowerCommands for Visual Studio 2010, Regular Expressions Margin, Search Work Items for TFS 2010, Source Outliner, Spell Checker, Structure Adornment (which also installs BlockTagger, BlockTaggerImpl, SettingsStore and SettingsStoreImpl), StyleCop, Team Foundation Server Power Tools, TFS Auto Shelve, Visual Studio Color Theme Editor, Visual Studio Pro Power Tools, VS10x Code Map, VS10x Code Marker, VS10x Collapse All Projects, VS10x Editor View Enhancer, VS10x Insert Debug Names, VS10x Selection Popup, VS10x Super Copy Paste, VSCommands 2010, and Word Wrap with Auto-Indent.


  • How should I prepare the design of a web page for a web developer?

    - by jackal
    What techniques, software or practices do you use to prepare a description of a web page for further development? I am researching (with little luck) how to create specifications for web developers: what should be included on the page (input widths, font sizes, image placement, etc.). Right now I use a combination of Excel and Word documents, which is inefficient in complex cases. Any other suggestions?


  • Configure PHP and Apache in Windows 7

    - by manxing
    I installed the Apache server successfully on Windows 7 (32-bit); it shows "It works" in the browser. I also created an info.php file containing <?php phpinfo(); ?>. But when I open http://localhost/info.php in the browser, all I get is exactly <?php phpinfo(); ?> in plain text. I restarted Apache every time I made changes. Can anyone help with this? Many thanks in advance!
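
    PHP source echoed back verbatim means Apache is serving .php files as plain text rather than handing them to the PHP module. On Windows the usual fix is loading the module in httpd.conf; a sketch for Apache 2.2 with PHP 5, where the paths are assumptions:

        LoadModule php5_module "C:/php/php5apache2_2.dll"
        AddHandler application/x-httpd-php .php
        PHPIniDir "C:/php"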


  • CSS/HTML element formatting template

    - by Elgoog
    I'm looking for an HTML template that contains all the different types of HTML elements, so I can take this template and start the CSS process; that way I can review the full style of the elements without having to create the styles for each element as I go. I realize I could build this myself, but I thought someone else may have already done it, or might know where to find such a template. Thanks for your help.
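
    Nothing definitive to point to, but the idea is a single page that exercises one of everything; a truncated sketch of such a file:

        <h1>Heading 1</h1> <h2>Heading 2</h2>
        <p>Paragraph with <a href="#">link</a>, <em>emphasis</em>, <strong>strong</strong>, <code>code</code>.</p>
        <blockquote>Blockquote</blockquote>
        <ul><li>Unordered item</li></ul> <ol><li>Ordered item</li></ol>
        <table><tr><th>Header</th></tr><tr><td>Cell</td></tr></table>
        <form><input type="text" /> <select><option>Option</option></select> <button>Button</button></form>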


  • Exit link tracking with timestamped logs on 3rd party content

    - by dandv
    I want to track clicks on exit links placed in 3rd-party content, for example on Twitter, and I also need the timestamps of the clicks. Google Analytics can't be embedded in 3rd-party content. Another option is a URL shortener like bit.ly; however, bit.ly and goo.gl don't log the time of a click with any better granularity than a full day. su.pr shows the time for the past day in its analytics graph, but its analytics download includes only the day, not the time. cli.gs was touted as having the most detailed analytics, yet it doesn't show the time either, and it forces the user through a preview page. Any ideas?


  • Apache returns 304, I want it to ignore anything from client and send the page

    - by Ayman
    I am using Apache HTTPD 2.2 on Windows. mod_expires is commented out, gzip is on, and most other settings are unchanged from the defaults. I made some changes to my .js files, but my client gets a 304 response for one of the .js files (trex.fps-iso.js, below) and so never receives the updated copy. How can I force Apache to, in effect, flush everything and send all the new files to the client? The main HTML file includes these scripts in the head section:

        <script src="js/jquery-1.7.1.min.js" type="text/javascript"></script>
        <script src="js/jquery-ui-1.8.17.custom.min.js" type="text/javascript"></script>
        <script src="js/trex.utils.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.core.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.codes.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.emv.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.b24xtokens.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.iso.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.span2.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.amex.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.abi.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.barclays.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.bnet.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.visa.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.atm.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.apacs.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.pstm.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.stm.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.thales.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.fps-saf.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.fps-iso.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.app.js" type="text/javascript" charset="utf-8"></script>

    The Apache access log shows the following:

        [07/Jul/2013:16:50:40 +0300] "GET /trex/index.html HTTP/1.1" 200 2033 "-"
        [07/Jul/2013:16:50:40 +0300] "GET /trex/js/trex.fps-iso.js HTTP/1.1" 304
        [08/Jul/2013:07:54:35 +0300] "GET /trex/index.html HTTP/1.1" 304 - "-"
        [08/Jul/2013:07:54:35 +0300] "GET /trex/js/trex.iso.js HTTP/1.1" 200 12417
        [08/Jul/2013:07:54:35 +0300] "GET /trex/js/trex.amex.js HTTP/1.1" 200 6683
        [08/Jul/2013:07:54:35 +0300] "GET /trex/js/trex.fps-saf.js HTTP/1.1" 200 2925
        [08/Jul/2013:07:54:35 +0300] "GET /trex/js/trex.fps-iso.js HTTP/1.1" 304

    Chrome reports the requests as below. This file is OK (latest version):

        Request URL: http://localhost/trex/js/trex.iso.js
        Request Method: GET
        Status Code: 200 OK (from cache)

    This file is OK (latest version):

        Request URL: http://localhost/trex/js/trex.amex.js
        Request Method: GET
        Status Code: 200 OK (from cache)

    This one is also OK:

        Request URL: http://localhost/trex/js/trex.fps-iso.js
        Request Method: GET
        Status Code: 200 OK (from cache)

    The rest of the scripts all show 200 OK (from cache).
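
    If the immediate goal is to stop conditional requests for the scripts while debugging, mod_headers can mark them uncacheable; a sketch (assumes mod_headers is enabled):

        <FilesMatch "\.js$">
            Header set Cache-Control "no-cache, no-store, must-revalidate"
        </FilesMatch>

    The usual longer-term fix is cache-busting: bump a version query string (e.g. trex.core.js?v=2) whenever a file changes, so clients fetch the new copy without any header changes.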


  • cURL works but PHP cURL fails to internet [migrated]

    - by wrk2bike
    I'm trying to diagnose an issue using PHP to cURL to an Internet location on a RedHat Linux server. cURL is installed and working, and <?php var_dump(curl_version()); ?> shows all the correct information in the output. The issue is that I can use PHP to cURL to localhost on the box itself, but not to the Internet (see below). Normally I'd suspect the firewall, but I can cURL from the command line to the Internet without a problem; the box can also update its own software packages, etc. What am I missing? My test is:

        <?php
        function http_head_curl($url, $timeout = 30)
        {
            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $url);
            curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
            curl_setopt($ch, CURLOPT_HEADER, 1);
            curl_setopt($ch, CURLOPT_NOBODY, 1);
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
            $res = curl_exec($ch);
            if ($res === false) {
                throw new RuntimeException("cURL exception: ".curl_errno($ch).": ".curl_error($ch));
            }
            return trim($res);
        }

        // Succeeds, displaying headers:
        echo(http_head_curl('localhost'));

        // Fails:
        echo(http_head_curl('www.google.com'));
        ?>
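
    One difference between the shell (where curl succeeds) and Apache (where it fails) on RedHat that is worth ruling out is SELinux, which by default denies httpd outbound network connections; a quick check, offered as a guess rather than a diagnosis:

        # does SELinux allow httpd to make outbound connections?
        getsebool httpd_can_network_connect
        # if it reports "off", enable it persistently
        setsebool -P httpd_can_network_connect 1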


  • How do I block a user-agent from Apache

    - by rubo77
    How do I implement a user-agent string block, by regular expression, in the config files of my Apache web server? For example, on my Debian server I would like to block all bots whose user-agent matches /\b\w+[Bb]ot\b/ or /Spider/. Those bots should not be able to see any page on my server, and they should not appear in the access logs or the error logs. http://global-security.blogspot.de/2009/06/how-to-block-robots-before-they-hit.html suggests using mod_security for this, but isn't there a simple directive for httpd.conf?
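
    There is a plain-Apache approach without mod_security: mod_setenvif tags matching user-agents, and both access control and logging can key off the tag. A sketch for Apache 2.2, adapting the question's regex (note that denied requests may still produce "client denied" lines in the error log):

        # tag requests whose User-Agent matches ...bot or Spider
        BrowserMatchNoCase "(\w+bot\b|spider)" blocked_ua
        <Location />
            Order Allow,Deny
            Allow from all
            Deny from env=blocked_ua
        </Location>
        # keep tagged requests out of the access log
        CustomLog logs/access_log combined env=!blocked_ua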


  • Setting up mod_proxy - Plesk, Apache, .htaccess?

    - by sam
    I want to set up mod_proxy so that my blog runs under a subdirectory of my site rather than a subdomain, so I get the SEO backlink benefit. What I'm looking to do: my Tumblr blog currently runs at blog.mysite.com (which is in turn mapped from myblog.tumblr.com), and I want it to run at mysite.com/blog instead. How can I set up mod_proxy to do this? Is it something I can set up from inside my .htaccess file? My site is hosted on an Apache server, using Plesk as a control panel. I contacted my web host and they told me mod_rewrite could achieve it; they gave me the following, but said they won't provide further support since mod_rewrite is something they don't support:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example.co.uk/blog$
        RewriteCond %{REQUEST_URI} !/standard
        RewriteRule ^(.*)$ http://example.tumblr.com$1 [R]
        </IfModule>

    Ideally I'd like to use the mod_proxy method, as it's recommended from an SEO point of view in this article: http://www.seomoz.org/blog/what-is-a-reverse-proxy-and-how-can-it-help-my-seo
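
    For reference, the mod_proxy version of this lives in the virtual host configuration rather than .htaccess (ProxyPass is not permitted in .htaccess context) and assumes mod_proxy and mod_proxy_http are loaded; a sketch:

        ProxyRequests Off
        ProxyPass        /blog/ http://example.tumblr.com/
        ProxyPassReverse /blog/ http://example.tumblr.com/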

