Search Results


  • Redirect from https://mydomain.com to http://mydomain.com

    - by Charlie
    Many of my visitors have already bookmarked my site as https://mydomain.com. On the bad advice of a programmer, I put my whole Joomla site under SSL, even though I don't sell anything or provide any member services. I asked him many times whether it would slow my site down; he said it wouldn't. I knew it did, and after researching on this site I realized it does slow the site down because the pages can't be cached. Understood. Please, someone tell me how to get away from it now. I'm not sure how to approach this: should I add something to my .htaccess or my main index.php file? I've looked all over the net; there is plenty of advice on redirecting from http to https, but very few answers about the opposite, going from https to http. Thank you very much for your time. I appreciate it.
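
    A common approach (a minimal sketch, assuming Apache with mod_rewrite enabled and an .htaccess file at the document root) is to reverse the usual rule and redirect secure requests back to plain HTTP:

        # Redirect any HTTPS request to its HTTP equivalent
        RewriteEngine On
        RewriteCond %{HTTPS} on
        RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]

    The 301 tells browsers and search engines to update their bookmarks and index to the plain-HTTP address.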

    Read the article

  • mysql_real_escape_string is giving me errors when I try to add security to my website

    - by Mike
    I tried doing this:

        @ $db = new myConnectDB();
        $beerName = mysql_real_escape_string($beerName);
        $beerID = mysql_real_escape_string($beerID);
        $brewery = mysql_real_escape_string($brewery);
        $style = mysql_real_escape_string($style);
        $userID = mysql_real_escape_string($userID);
        $abv = mysql_real_escape_string($abv);
        $ibu = mysql_real_escape_string($ibu);
        $breweryID = mysql_real_escape_string($breweryID);
        $icon = mysql_real_escape_string($icon);

    I get this error:

        Warning: mysql_real_escape_string() [function.mysql-real-escape-string]: Access denied for user
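
    For context, mysql_real_escape_string() requires an open MySQL connection; when none exists, PHP attempts a connection with its default credentials, which is exactly what yields an "Access denied for user" warning. A minimal sketch (the connection parameters are hypothetical) that escapes against an explicit link:

        <?php
        // Open the connection first, then pass it to the escape call so PHP
        // does not fall back to default credentials.
        $link = mysql_connect('localhost', 'db_user', 'db_pass'); // hypothetical credentials
        $beerName = mysql_real_escape_string($beerName, $link);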

    Read the article

  • Disallow all user agents except one using .htaccess?

    - by Kian Mayne
    I've been struggling to get this .htaccess working. The aim is to disallow all user agents besides my app. The app sends a GET request with a user agent of, let's say, 'AcmeUpdater'. Whenever I try to navigate to any file in the folder, I get a 500 - Internal Server Error. Here are the rules I'm using:

        <IfModule mod_rewrite.c>
        Options +FollowSymLinks
        RewriteEngine on
        RewriteBase /
        RewriteCond %{HTTP_USER_AGENT} !^KMUpdaterClient*
        RewriteRule .* - [F,L]
        </IfModule>

    I have updated the .htaccess file as suggested in the answer by Nick, and restarted Apache. After trying a couple of different things, it seems that just the presence of an .htaccess file is causing the 500 error, and I'm getting nothing in the error logs. The .htaccess file at the document root looks like the following:

        <IfModule mod_rewrite.c>
        Options +FollowSymLinks
        ErrorDocument 404 /index.php?error=404
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        </IfModule>

    Then I realised that the error logs were in chronological order rather than the reverse chronological order I expected (oops!). The error I'm getting is: </IfModule> without matching <IfModule> section. I removed the </IfModule> and I still get that error. Ideas?
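
    For reference, a user-agent allowlist is commonly written like this (a sketch, assuming the client really identifies itself as 'AcmeUpdater'; note the rules quoted above test for 'KMUpdaterClient' instead):

        <IfModule mod_rewrite.c>
        # Forbid any request whose User-Agent does not begin with AcmeUpdater
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} !^AcmeUpdater [NC]
        RewriteRule .* - [F,L]
        </IfModule>

    When the mere presence of an .htaccess file produces a 500, the usual culprit is a directive the server forbids at that level - for example, Options +FollowSymLinks is only legal in .htaccess when AllowOverride includes Options.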

    Read the article

  • What is the role of web hosting in SEO? [closed]

    - by Vinay
    Possible Duplicate: Does changing web hosting server affect SEO page ranking? SEO Geolocation What are the best ways to increase a site's position in Google? How to find web hosting that meets my requirements?

    I have read somewhere that hosting providers play a role in website SEO. My website is hosted with Yahoo Small Business, which provides analytics and some other tools to check keyword activity, though I think the same can be achieved with Google Analytics. Server performance and uptime is one important factor. I also have a few doubts in my mind: 1) Does shared hosting affect SEO, and what is the role of the domain extension (.com, .in, .org, etc.)? 2) Does server geolocation affect SEO? 3) Does the server OS affect SEO? Apart from the above, are there any other factors that affect SEO? One last question: if hosting really matters a lot, can you suggest a web hosting service for a small-business e-commerce site running PHP?

    Read the article

  • Free Domain hosting configurations and transfer

    - by upog
    I have registered a new domain name with GoDaddy.com, and now I would like to host my domain for free. Assume the app is a basic HTML page. I have done some searching and decided to host it under Google App Engine. I am looking for answers to a few questions: Currently my domain name is managed by GoDaddy; how can I transfer it to Google App Engine, so that going forward it will be managed by Google? How can I configure the new domain in Google App Engine and associate it with my domain name? Is there any indirect cost involved in hosting a domain on Google App Engine? Any suggestion for free and reliable hosting is also welcome. Update: Can I host a free web page on cloud.google.com?

    Read the article

  • How does a web server choose between unicode and UTF-8 for accented characters?

    - by jacques
    I have a web server with my ISP which replaces accented characters in URLs with their bare Unicode code points as single bytes: for instance é (eacute) is translated to %e9 (dec 233). For testing I use EasyPHP locally, which translates those characters to their UTF-8 equivalents: é is then replaced by the well-known sequence %c3%a9 (é)... Browsers served by EasyPHP don't decode the single-byte values, but they do when running locally (UTF-8 and unconverted accents as well)... I have been unable to find where this behavior is configured in the server. This is a problem, as some URLs are built by my application using PHP's rawurlencode(), which seems to always encode with single-byte values on both servers. Any ideas? Thanks in advance.
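
    For what it's worth, rawurlencode() is byte-oriented: its output depends on the encoding of the string it is handed, not on the web server. A quick sketch:

        <?php
        // rawurlencode() percent-encodes the raw bytes it receives, so the
        // result reflects the string's encoding, not any server setting:
        echo rawurlencode("\xE9");     // Latin-1 "é"  -> %E9
        echo rawurlencode("\xC3\xA9"); // UTF-8  "é"   -> %C3%A9

    So if the two servers produce different encodings for the same page, the likely difference is the encoding of the data (or source files) feeding rawurlencode(), rather than an Apache directive.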

    Read the article

  • 500 error on Joomla website

    - by Rachel Sparks
    PHP Fatal error: Call to a member function setQuery() on a non-object in /home/josh/public_html/administrator/components/com_jfusion/plugins/phpbb3/forum.php on line 226

    Just moved over to a new server. Anyone have ideas as to what is wrong? Is this a database issue? Line 226:

        //get permissions for all forums in case more than one module/plugin is present with different settings
        $db = & JFusionFactory::getDatabase($this->getJname());
        $query = "SELECT forum_id FROM #__forums WHERE forum_type = 1 ORDER BY left_id";
        $db->setQuery($query); //226
        $forumids = $db->loadResultArray();
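
    The fatal error means $db was not an object when setQuery() was called, i.e. JFusionFactory::getDatabase() returned nothing usable - after a server move this is typically stale phpBB3 connection details in the JFusion plugin settings. A small diagnostic sketch:

        <?php
        // Guard the lookup: if JFusion cannot connect to the phpBB3 database,
        // getDatabase() will not return an object, producing the fatal above.
        $db = & JFusionFactory::getDatabase($this->getJname());
        if (!is_object($db)) {
            JError::raiseWarning(500, 'JFusion: no database connection for ' . $this->getJname());
            return array();
        }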

    Read the article

  • JavaScript slider with images and text from PHP, scrollable in groups by indexes

    - by Roberto de Nobrega
    I am looking for a JavaScript solution that slides images with text pulled from PHP. The slider should slide in groups, indexed by points. I was googling, but found nothing like what I need, so let me give an example: imagine 10 products. I need to show the principal picture and a text below the image. Six products are shown at a time, and when I click a point (index), the group slides to the next group. Do you know of such a script? I know PHP, but I am a newbie with JavaScript. Thanks! PS: I am not sure where I should have put this question, so if this was the wrong place, let me know and please accept my apologies!
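
    A minimal sketch of the sliding mechanic (the markup is hypothetical: PHP would print all product cells into one long strip, and each dot would call goToGroup with its group index):

        <div id="viewport" style="width:720px; overflow:hidden;">
          <div id="strip" style="white-space:nowrap; transition:margin-left 0.4s;">
            <!-- PHP echoes one 120px-wide inline-block cell per product here -->
          </div>
        </div>
        <script>
        function goToGroup(i) {
          // Slide the strip left by i groups of 6 cells x 120px each
          document.getElementById('strip').style.marginLeft = (-i * 6 * 120) + 'px';
        }
        </script>

    Ready-made carousels (jQuery-based ones, for example) wrap the same idea with indexes and animation options.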

    Read the article

  • How to interpret the number of URL errors in Google Webmaster Tools

    - by user359650
    Recently Google made some changes to Webmaster Tools, which are explained here: http://googlewebmastercentral.blogspot.com/2012/03/crawl-errors-next-generation.html. One thing I could not find out is how to interpret the number of errors over time. At the end of February we migrated our website and didn't implement redirect rules for some pages (quite a few, actually). Here is what we're getting from the Crawl errors. What I don't know is whether the number of errors is cumulative over time (i.e. if Google bots crawl your website on 2 different days and find 1 separate issue each day, whether they will report 1 error for each day, or 1 for the 1st and 2 for the 2nd). Based on the Crawl stats we can see that the number of requests made by Google bots doesn't increase. Therefore I believe the number of errors reported is cumulative, and that an error detected on one day is carried over and reported on subsequent days until the underlying problem is fixed and the page is crawled again (or until you manually mark the error as fixed), because if you don't make more requests to a website, there is no way to check new pages and old pages at the same time. Q: Am I interpreting the number of errors correctly?

    Read the article

  • Facebook page design is not working in IE8 [closed]

    - by PrateekSaluja
    Hello experts, we have designed a Facebook page. It works fine in all browsers including IE7, but not in IE8. We checked, and found that if we run our code outside the Facebook page it works in IE8, but when we put it into the Facebook page it does not. Here is the CSS we are using for IE8:

        <!--[if lt IE 8]>
        <style>
        .nv_a { width:90px; height:27px; float:left; text-align:center; padding-top:8px; }
        .nvt_a { width:66px; height:27px; float:left; text-align:center; padding-top:8px; }
        .nv_a a { width:90px; height:27px; float:left; padding-top:8px; text-align:center; color:#000; display:inline-block; text-decoration:none; background-color:#e0e0e0; border-top:solid 1px #999; border-left:solid 1px #999; border-right:solid 1px #999; border-bottom:solid 1px #999; }
        .nv_a a:hover { width:90px; height:27px; padding-top:8px; float:left; color:#000; text-align:center; background-color:#ccc; }
        .nvt_a a { width:66px; height:27px; float:left; padding-top:8px; text-align:center; color:#000; display:inline-block; text-decoration:none; background-color:#e0e0e0; border-top:solid 1px #999; border-left:solid 1px #999; border-right:solid 1px #999; border-bottom:solid 1px #999; border:1px solid red; }
        </style>
        <![endif]-->

    Please help us solve the issue.
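
    One thing worth checking (an observation, not a confirmed fix): the conditional comment [if lt IE 8] matches only browsers below IE8, so IE8 itself never receives this stylesheet. Targeting IE8 as well would look like:

        <!--[if lte IE 8]>
        <style> /* rules for IE8 and below go here */ </style>
        <![endif]-->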

    Read the article

  • What a web developer can learn [closed]

    - by knoxxs
    There are many things to learn in web development. You can easily find the most important things you need to learn if you want to be a webmaster; answers to questions about how to become a web developer or a webmaster contain only a limited list of items to master (some examples: a, b). But the problem is that these resources are not complete. When I started learning web development I followed the same steps, but after learning the basics I realized I had learned almost nothing - there were many more things to learn. I realized this by following blogs and Q&A sites. When I first downloaded the HTML5 Boilerplate, some of the issues it covers I hadn't even heard about. I want you to suggest what things and issues someone can learn, and why. I know the answer is "follow blogs and do your work, you will learn with time", but through these platforms I could benefit from others' experience. This question is not about how to become a webmaster, but an answer to it may cover that too.

    Read the article

  • What to look for in a free hosting plan? [duplicate]

    - by Jon
    This question already has an answer here: How to find web hosting that meets my requirements? (5 answers)

    I have a test website that's hosted on a free plan by Zymic. At the moment I'm typing this, my site is down, and it's been down for over 2 days. I thought it was a coding problem at first, and then found out I couldn't connect to my server. I don't want to let my clients down in the future. Zymic had very good reviews, and its downtime was OK (neither high nor low), but now I want to change my web host. What should I look for (besides an uptime guarantee)? Also, do you have any suggestions for a host with all the benefits? Any feedback will be greatly appreciated.

    Read the article

  • Webpage redirection time

    - by Abhijeet Ashok Muneshwar
    I want to calculate the time consumed in redirecting from one webpage to another. For example: 1) I am using Facebook in the Google Chrome browser, and have shared a link on my Facebook profile: http://www.webdeveloper.com/ (it's not only Facebook; it can be any domain having a link to another domain). 2) When I click on this link from my Facebook profile, the website opens in a new tab. 3) I want to calculate the time difference in milliseconds or microseconds between two events: First event: the time of clicking the link http://www.webdeveloper.com/ on my Facebook profile. Second event: the time at which the webpage at http://www.webdeveloper.com/ has completely loaded. Thank you in advance.
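
    In browsers that support the Navigation Timing API (Chrome does), the landing page itself can measure this. A sketch, assuming you can add a script to the destination page; navigationStart is stamped when the browser begins the navigation (the click) and loadEventEnd when the page has completely loaded:

        <script>
        window.addEventListener('load', function () {
          // Defer one tick so loadEventEnd has been filled in
          setTimeout(function () {
            var t = performance.timing;
            var ms = t.loadEventEnd - t.navigationStart; // click-to-loaded, in milliseconds
            console.log('Page loaded in ' + ms + ' ms');
          }, 0);
        });
        </script>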

    Read the article

  • DNS records on our website: what are they for?

    - by Blake Nic
    Recently we had to get some DDoS protection for our website because of the large attacks we were seeing after gaining a bit of popularity. We handed over our domain and hosting information to our DDoS protection provider. It worked perfectly, but I have a question. In our DNS records we have Host, Answer, and Type. The Host has our domain name. The Answer is this: SOMETEXTXXXX.dv.googlehosted.com. When I copy and paste it into my browser it gives me a 404 error, yet our website still loads and functions as it should. I don't understand why it would need this. I asked them about it, and they said it is a method for DDoS protection and that the other IPs are the reverse proxy (the other IPs give a 404 error too). Can anyone expand on this? How does it all tie together, and how does the browser know where to point the user with all these reverse proxies? Thank you. Here is an image for reference: http://i.stack.imgur.com/qo5QO.png

    Read the article

  • Dropped impressions 25 days after restructure

    - by Hamid
    Our website is a non-English property website (moshaver.com), similar to rightmove.co.uk. In September 2012 our website was adversely affected by Panda, causing our incoming clicks from Google to drop from around 3000 to less than a thousand. We were hoping Google would eventually realize we are not a spam website and things would get better, but by August 2013 we were almost sure we needed to do something, so we started to restructure our web content. We used the canonical tag to remove our search results pages and point to our listing pages, used the noindex tag to remove listing pages that currently have no properties, and changed title tags to friendlier ones, among other changes. Our changes took effect on 10th August. As shown in the graph from the Search Engine Optimization section of Google Analytics, these changes resulted in an increase in the number of times Google displayed our results: our impressions almost doubled starting 15th August. However, our CTR dropped from around 15% to 8% from that date. This might have been because of our changed title tags (making people less likely to click on them), or it might be normal for increased impressions. This situation continued until 10th September, when our impressions decreased dramatically to less than a thousand - almost 30% of our original impressions (before the restructure) and 15% of the new impressions. At the same time our CTR increased dramatically to around 50%. I have two theories for this increase. The first is that these statistics are less accurate at lower impression counts. The second is that Google is now only displaying our results for queries directly related to our website (our name, our URL), and not for general terms such as "apartments in a specific city"; this second theory also explains the dramatic decrease in impressions. After digging into the analytics data a little more, I constructed a table breaking down our impressions, clicks and CTR across different Google products (web and image search) and in total. What I understand from this table is that most of our increased impressions after the restructure were in image search, and I don't think image search users would be looking for our kind of content. Furthermore, it shows that the drop in our web search CTR is not as dramatic as the drop in overall CTR (-30% compared to -60%). I thought posting this here might help you understand the situation better. Is it possible that Google tested our new structure for 25 days and then decided to decrease our impressions because of the new low CTR? Or should we look for another factor? If this is the case, how long does it usually take for Google to give us another chance? It has been one month since our impressions dropped.

    Read the article

  • How to increase backlinks to a blog or website [duplicate]

    - by Adarsh Sojitra
    This question already has an answer here: How do I build backlinks? (5 answers)

    I know this question is very easy and also silly, but I don't know how to make backlinks to my blog. I have tried commenting on various blogs and websites, but in Alexa there is only 1 backlink, which is my own blog. Does anyone know how to make quality backlinks for a blog or website? I also want to know whether increasing backlinks improves my blog's SEO. Thanks in advance.

    Read the article

  • Ranking hit after WP site migration

    - by Ben
    I migrated my site from its old domain over a month ago. I followed Webmaster Tools' guidance completely, including 301 redirects from every existing URL to the new domain, and then submitted a change of address. Traffic continued as normal, but a few days after submitting the change of address, traffic plummeted to about 20-30% of what it was previously. Most of my traffic comes from organic search, and I can see that for the keywords I had targeted and performed well with before, I am now ranking much lower. For low-competition keywords I've only lost a few places; for higher-competition terms I have really suffered. Things have started to pick up a bit (for one of my keywords I have risen from 195 to 100 in the last week), but it seems to be a very slow process. How seamless is this process normally? I was under the impression that it would not affect my rankings too severely, but it has now been a month since the move and recovery seems very slow, if it is happening at all. Is it likely that I've missed something? The only change is that I have moved what was the home page to be more of a sub-page, and in its place is now a magazine-style home page. I understand that links to the old site now point to the latter, which means rankings for some keywords attributed to the old home page will take a hit, but even on other pages that fit exactly the same page structure as the previous site I have seen a drop in rankings. Any help would be greatly appreciated. Thanks!
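
    For anyone auditing their own migration, the domain-wide 301 in the old site's .htaccess typically looks like this (a sketch, assuming Apache with mod_rewrite; newdomain.com is a placeholder):

        # Send every request on the old domain to the same path on the new one
        RewriteEngine On
        RewriteRule ^(.*)$ http://newdomain.com/$1 [R=301,L]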

    Read the article

  • Sitemap structure for network of subdomains

    - by HaCos
    I am working on a project that's a network of 2 domains (domain.gr & domain.com.cy) of subdomains, similar to HubPages (each user gets a profile under a different subdomain, and there is a personal blog for that user as well), and I think there is something wrong with our sitemap submission: sometimes it takes weeks for new profiles to get indexed. We use one Webmaster Tools account to manage the whole network, and we don't want to create different accounts for each subdomain since there are already more than 1000. Following this post http://goo.gl/KjMCjN, I ended up with a structure of 5 sitemaps: the 1st sitemap indexes the others; the 2nd covers all user profiles under domain.gr; the 3rd covers all user profiles under domain.com.cy; the 4th covers all posts under *.domain.gr (a news sitemap, http://goo.gl/8A8S9f); and the 5th covers all posts under *.domain.com.cy (again a news sitemap). Now my questions: Should we create news sitemaps, or just list all posts in the 2nd & 3rd sitemaps? Does link ordering matter - e.g. should the most recently created user be first in the sitemap, or does it make no difference as long as the lastmod date is correct? Does anyone know how HubPages submit their sitemap in Webmaster Tools, so maybe we could follow their way? Any alternative or better way to index this kind of schema? PS: Our network is multilingual - Greek & English are available. We use hreflang tags in the head of each page to separate the country target of each version.
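
    For reference, the "1st sitemap that indexes the others" is a standard sitemap index file (a sketch; the file names are hypothetical):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <!-- one <sitemap> entry per child sitemap -->
          <sitemap><loc>http://domain.gr/sitemap-profiles-gr.xml</loc></sitemap>
          <sitemap><loc>http://domain.com.cy/sitemap-profiles-cy.xml</loc></sitemap>
          <sitemap><loc>http://domain.gr/sitemap-posts-gr.xml</loc></sitemap>
          <sitemap><loc>http://domain.com.cy/sitemap-posts-cy.xml</loc></sitemap>
        </sitemapindex>

    Note that entries pointing at the other domain are only honored when both hosts are verified in the same Webmaster Tools account (cross-site submission).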

    Read the article

  • fully encrypt website using SSL

    - by eddywebs
    I have been trying to use SSL for the following site: http://bit.ly/e8Lj32. Although the SSL certificate is signed properly by Network Solutions, each time a page loads the browser still displays an SSL warning: "Some parts of the site are not using SSL". In IE it's even worse: if you choose "No, I don't want to view the unsecured parts of the page", the site does not display properly (as it blocks some of the widgets). Screenshots are up at http://i.imgur.com/fm5GO.png
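
    That warning is mixed content: the page is served over HTTPS, but at least one asset (often a widget script or image) is referenced with an absolute http:// URL. A sketch of the usual fix (the asset URL is hypothetical):

        <!-- An absolute http:// reference triggers the mixed-content warning: -->
        <script src="http://example.com/widget.js"></script>

        <!-- A protocol-relative (or https://) reference inherits the page's scheme: -->
        <script src="//example.com/widget.js"></script>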

    Read the article

  • 410 Responses when your CMS host doesn't support them?

    - by leeand00
    Sending a 410 response for a page that no longer exists should make Google stop crawling that page. The site I am working on was recently migrated, and very little of the content was migrated with it. I've already turned the existing content into 301 redirects (the content that is on both the old and the new site), but now I would like to flush the old content from Google's memory by serving 410 responses when it returns to crawl those pages and finds a 404. However, I asked our CMS host about it, and they said our CMS does not support 410 responses. Is there some other way to produce a 410 response, like making a dead link 301-redirect to a page that carries a 410 response in the form of a meta tag?
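
    For context, the status code lives in the HTTP response line, so a meta tag cannot produce a real 410. If the host permits .htaccess or raw PHP, either of these sends one (sketches; /old-page is a placeholder):

        # .htaccess (mod_alias): answer 410 Gone for a retired path
        Redirect gone /old-page

        <?php
        // Or at the top of a PHP page:
        header('HTTP/1.1 410 Gone');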

    Read the article

  • Apache + Lighttpd serving from same Domain name

    - by Alex Pineda
    We wish to host some pages on a new server with Apache 2, and embed some of our old content & functionality from another server running lighttpd in an iframe. I'm looking at this configuration from the Apache docs (http://httpd.apache.org/docs/2.2/vhosts/examples.html#page-header) under "Using Virtual_host and mod_proxy together":

        <VirtualHost *:*>
        ProxyPreserveHost On
        ProxyPass / http://192.168.111.2/
        ProxyPassReverse / http://192.168.111.2/
        ServerName hostname.example.com
        </VirtualHost>

    The only issue is that I want to proxy only on a subdomain - or even better, keep the top domain and proxy only if the URL contains a particular path, e.g. "/myprocess.php". So in essence the DNS will point to Apache 2 as the "master router".
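
    Path-scoped proxying is just a narrower ProxyPass prefix (a sketch, assuming mod_proxy and mod_proxy_http are loaded and the lighttpd box is still at 192.168.111.2):

        <VirtualHost *:80>
            ServerName hostname.example.com
            ProxyPreserveHost On
            # Only /myprocess.php is forwarded to lighttpd; Apache serves everything else
            ProxyPass        /myprocess.php http://192.168.111.2/myprocess.php
            ProxyPassReverse /myprocess.php http://192.168.111.2/myprocess.php
        </VirtualHost>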

    Read the article

  • Will domain change affect my pagerank?

    - by Chankey Pathak
    I have two Blogger blogs (http://chankeypathak.blogspot.com and http://javaenthusiastic.blogspot.com). One blog has PR 3 and the other has PR 2. I want to buy domains for both blogs so that they become http://chankeypathak.com/ and http://javaenthusiastic.com/. I will follow all the procedures Blogger suggests, so that all visitors to http://chankeypathak.blogspot.com are redirected to http://chankeypathak.com/, and the same for the Java blog. I just want to know whether this will affect my PageRank. I want my PR to remain the same and not change because of the domain change. Let me know, thank you. PS: I don't know whether one is allowed to post one's site URL in questions; if not, you may edit the question.

    Read the article

  • How do search engines handle hyphenated words?

    - by NinjaKC
    I am not sure my title fully explains what I mean, but I thought this might be an interesting question. If I have a keyword broken up with a dash or two, will search engines consider the dash-split keyword as the full keyword? Say I have a site that breaks words down the way dictionary sites do. A keyword for a page might end up in the page, and/or in the URL, broken by dashes:

    Key-word = keyword
    Co-op-er-at-ive = cooperative
    Pho-to-gra-phy = photography

    www.example.com/key-word/
    www.example.com/co-op-er-at-ive/
    www.example.com/pho-to-gra-phy/

    I know search engines (at least Google) treat a dash as a space and read it as multiple words. But in English a dash can also break a single word into syllables (at least I think it can, can't it?), so will search engines take this into consideration too? I did a little research: I googled some words with random dashes inserted, and it returned the words I searched for - but that could be Google treating it as a user typo. So really I am wondering whether I can deliberately put a dash in a keyword and have search engine spiders still match that keyword as the real, dash-free word. I've done a little googling and looking here on Stack Overflow, but everything comes down to dashes between multiple words, not the specific thing I'm trying to figure out. Hopefully that makes sense; I am not an SEO expert yet, but I get the basics and have been experimenting, and this is just a random question to satisfy my curiosity. :P

    Read the article

  • How to create a good sitemap for dynamic website

    - by Saif Bechan
    I have a website with dynamic content and different kinds of pages. Some pages rarely change, and some, like blogs, change often. The blog pages also have links for sorting, for example sorting by date, ascending or descending. On some pages I also have links to tabbed content, and links that are just anchor links. When I use an XML sitemap generator, all of these links are thrown into the sitemap, so I don't think all of them are really relevant. The blog posts up until now are also included in the sitemap - is this really necessary? I think links to the blog posts can be indexed just fine on their own. Is the best approach to manually assign just the main menu links to the sitemap, or is indexing everything really recommended?

    Read the article

  • Screencast several application windows at once in Microsoft Windows

    - by Birt
    I have several (20+) applications running on a Microsoft Windows PC. What I would like is a solution that lets me broadcast the window of each application in a webpage, in read-only mode (there's no need for users to interact with it). This should work even when an application is in the background, seeing that there's no way to fit all of them on screen. I performed very extensive searching, from simple screencasting apps such as Camtasia, CamStudio or VHScrCap, to things like VNC (I haven't found any server able to broadcast multiple windows at once, much less background windows), and even application virtualization, but in the end I haven't found anything that fits my needs. Most solutions that allow capturing a window instead of the whole desktop will only capture a single window, and on top of that they don't even work when the window is in the background.

    Read the article
