Search Results

Search found 9728 results on 390 pages for 'zee pro'.

Page 212/390 | < Previous Page | 208 209 210 211 212 213 214 215 216 217 218 219  | Next Page >

  • Suddenly my server rejects all POST requests

    - by Sharen Eayrs
    Just go to meet-romance.com/test.htm. The script there is simple: a form with a button <form action="test.htm" method="post"> <input name="Button1" type="submit" value="button" /> </form> It doesn't work: press the button in Firefox and I get a "connection reset" error. It has been happening since yesterday, and I wonder why. I have migrated all domains that require POST requests to another server. I suppose a reset of the server would fix it, only for it to happen again some other time, so I wonder if anyone has a clue as to the cause.
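
    For anyone diagnosing a similar failure, here is a minimal Python sketch (assuming the third-party requests library and the URL from the question) that replays the same form POST from outside a browser, which helps distinguish a server-side rejection from a browser or network problem:

        import requests  # third-party: pip install requests

        url = "http://meet-romance.com/test.htm"
        try:
            # Replay the form submission the test page performs
            r = requests.post(url, data={"Button1": "button"}, timeout=10)
            print(r.status_code, r.headers.get("Server"))
        except requests.exceptions.ConnectionError as err:
            # A reset here too points at the server or a firewall, not the browser
            print("connection failed:", err)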

  • Duplicating content from another site and adding value (summaries, statistics) - ranking and courtesy

    - by Krastanov
    I am working on a site that takes a governmental database, provides a number of statistical and other summaries, and also posts the original data. However, this data (mostly long pieces of text) is also published on the official governmental site (without the added value of summaries). Should I worry about my Google ranking because of this duplication? What is the preferred way to point to the official source of the information? There is no advertising on my site. My site is ".com"; the governmental site is ".bg".

  • I need a backend system that is integrated with web services. Is there an open source solution?

    - by Jarom
    I'm basically familiar with what I need in order to set up web services that talk to a centralized database, but if I don't have to do all the work myself, I'd rather not. Is there an open source solution that would let me easily expose a central database through web services? I want to make a site that is powered by a database that can also be accessed by other clients, such as mobile apps. What are the steps involved in setting up such a site? Any help is appreciated!
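
    To make the goal concrete, this is the shape of what is being asked for: one central database exposed through a small JSON web service that both the site and mobile apps can call. A minimal sketch in Python's standard library with a hypothetical SQLite database (a ready-made solution would replace this, not extend it):

        import json, sqlite3
        from http.server import BaseHTTPRequestHandler, HTTPServer

        DB = "central.db"  # hypothetical central database with an items table

        class Api(BaseHTTPRequestHandler):
            def do_GET(self):
                if self.path != "/api/items":
                    self.send_error(404)
                    return
                # The same endpoint serves the website and any mobile app
                rows = sqlite3.connect(DB).execute("SELECT id, name FROM items").fetchall()
                body = json.dumps([{"id": r[0], "name": r[1]} for r in rows]).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)

        HTTPServer(("", 8000), Api).serve_forever()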

  • What is the benefit of the "download will begin shortly" page?

    - by Fammy
    I've noticed many websites that host files for download have an interstitial page between the download link/button and the actual start of the download. Terminology on the page may include "Your download will begin shortly. If it does not, try this direct link". What is the purpose of this page? It seems to detract from the general experience of downloading a file. Is it beneficial for bookmarking? Less experienced users? Analytics?

  • How will this affect my SEO ranking?

    - by dunc
    I run a fishkeeping website based on the WordPress (PHP) CMS. I've recently put a fairly complex "filter" into place which searches my content for mentions of fish species profiles and turns them into active links. For example, asdasd this is a test about abdomen to see if the caudal fin will work asdadasdas try again with abdomen and A. panduro and Apistogramma panduro ...becomes asdasd this is a test about abdomen to see if the caudal fin will work asdadasdas try again with abdomen and <a href="/?p=1703" class="link_species">A. panduro</a> and <a href="/?p=1703" class="link_species">Apistogramma panduro</a> On the rest of my website, the species are linked with pretty URLs such as /species/apistogramma-panduro/, but due to the way this filter works, the only information I have access to is the ID of the post. As such, I'm using /?p=1703 or whatever the ID is. What I'd like to know is: how much will this affect my SEO rating/ranking? Will it be detrimental if I don't rewrite the function? Thanks in advance
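
    If the filter can resolve a post ID to its pretty permalink at link-building time (WordPress exposes this as get_permalink()), the /?p= URLs disappear entirely. A language-neutral sketch of that idea in Python, with a hypothetical ID-to-slug lookup standing in for the WordPress call:

        SLUGS = {1703: "apistogramma-panduro"}  # hypothetical id -> slug lookup

        def species_link(post_id, label):
            # Prefer the pretty permalink; fall back to ?p= only when no slug is known
            slug = SLUGS.get(post_id)
            href = f"/species/{slug}/" if slug else f"/?p={post_id}"
            return f'<a href="{href}" class="link_species">{label}</a>'

        print(species_link(1703, "A. panduro"))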

  • Strategy for image sizes

    - by MotiveKyle
    I run a site that has a lot of writers that generate quite a few articles a day. I require them to provide two image sizes (one for the big headline image and one as the thumbnail). I've been wanting to change up the site layout a bit, but I am becoming limited by the image sizes for the posts. I have considered just cropping images, but they still need to look nice, and cropping doesn't always generate what I'd want. I'd prefer to just scale down by percentage (as I do with thumbnails). Should I just make the writers provide more images? How do other sites handle this?
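
    On the automation side, scaling by percentage is easy to script, which may reduce how many hand-cut sizes the writers must supply. A minimal Python sketch using the Pillow imaging library (the file names and percentage are hypothetical):

        from PIL import Image  # third-party: pip install Pillow

        def scale_by_percent(src, dest, percent):
            # Resize proportionally, e.g. percent=30 keeps 30% of each dimension
            img = Image.open(src)
            w, h = img.size
            img.resize((w * percent // 100, h * percent // 100)).save(dest)

        scale_by_percent("headline.jpg", "thumb.jpg", 30)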

  • Getting IP/WHOIS information from users clicking ads in Google AdWords?

    - by nctrnl
    I am running a Google AdWords campaign and am wondering whether I can get any information about the people who click my ads. I already collect statistics on the website, but I get no referrer information from the Google ads in the search network, only from the display network. I am also running Google Analytics on the website, but I can't find any relevant information there. P.S. Tag suggestion: google-adwords

  • How should I setup billing for AdWords when managing a client's campaign in My Client Center? [closed]

    - by Dustin
    I have worked with Google AdWords before and will now be managing an AdWords account for a client. I have a My Client Center account, but I'm wondering what the best practices are for billing. Should I link billing to my own credit card and then have the client pay me (they have to pay me to manage the account anyway), or should I have the client pay Google directly? How is this usually done? If it is the latter, what is the best way to have them input their payment info?

  • SEO perspective on a non-existent directory base in URLs?

    - by Sandro Dzneladze
    I'm wondering if there will be any SEO/readability/memorability benefit to using this kind of URL structure for my upcoming project: www.moviereviews.com/movie/name, considering that /movie is not a real directory, so that page doesn't exist on its own. It's similar to the WordPress /category/ base, which is used purely for content separation on the site. What do you think? For the user it will be beneficial: if the domain doesn't signal what the content is about, the extra directory segment tells them. Correct? But what about from an SEO perspective?
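
    For what it's worth, on most modern stacks such a base segment is just a routing prefix rather than a filesystem path. A minimal Python sketch (Flask here purely as an illustration, not necessarily the asker's stack):

        from flask import Flask  # third-party: pip install flask

        app = Flask(__name__)

        # "/movie" never exists on disk; it only namespaces the review pages
        @app.route("/movie/<name>")
        def movie(name):
            return f"Review page for {name}"

        if __name__ == "__main__":
            app.run()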

  • How to handle multiple pages of the same site with the same outlinks

    - by pandafromchina
    I am developing a backlink tool for Chinese SEO (our website URL is http://link.aizhan.com), just like ahrefs.com. I have encountered a problem: how to handle multiple pages of the same site that share the same outlinks. For example, most pages of bbs.chinaz.com have the same outlinks, such as: bbs.chinaz.com/Tea/thread-6293993-1-1.html bbs.chinaz.com/Tea/list-1.html bbs.chinaz.com/alimama/thread-6265032-1-1.html bbs.chinaz.com/alimama/thread-6265032-2-1.html?userid=-1&extParms= bbs.chinaz.com/Shuiba/list-1.html bbs.chinaz.com/FeedBack/thread-4456753-1-1.html etc. All of these pages have the same outlinks at the top of the page: www.cnzz.com (anchor text: ????) www.313.com (????) www.idc123.com (????) Suppose I store these outlinks in a database. An SEO will then find six backlinks from bbs.chinaz.com to www.cnzz.com, which is obviously meaningless for SEO. Can you tell me how you deal with this problem?
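
    One common approach (an assumption here, not a method published by ahrefs) is to collapse backlinks to the level of the linking domain, so site-wide boilerplate links count once. A minimal Python sketch:

        from urllib.parse import urlsplit

        # (source page, target URL) rows as they come out of the crawler
        raw_links = [
            ("http://bbs.chinaz.com/Tea/thread-6293993-1-1.html", "http://www.cnzz.com"),
            ("http://bbs.chinaz.com/Tea/list-1.html", "http://www.cnzz.com"),
            ("http://bbs.chinaz.com/Shuiba/list-1.html", "http://www.cnzz.com"),
        ]

        # Keep one (source domain, target) pair instead of one row per source page
        unique = {(urlsplit(src).netloc, dst) for src, dst in raw_links}
        for src_domain, dst in sorted(unique):
            print(f"{dst} <- {src_domain}")  # one linking domain, not three rows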

  • Why Can't Computers Off My Network See the Site? [migrated]

    - by nmagerko
    I have just set up Apache, PHP, MySQL, etc. on my Ubuntu OS, and I was wondering why computers that are not on my network cannot see the basic index.html that Apache uses as the default. I set up a static IP address for my computer, and computers on the network use 192.168.1.100 to view the simple site. Is there something I am missing that will allow others to access my site? (It is REALLY simple; no graphics, CSS, etc.)

  • Reasonably priced registrar for obscure TLDs like .tt

    - by Stu
    I'm looking to buy a .tt domain name; however, the only registrars I can find want between $500 and $3000 for one domain. Considering I can buy a .com for around $10 a year, I don't consider that very reasonable! It's going to be for a personal (non-monetised) blog, hence why I'm not willing to spend over $500 a year on it. Does anyone know of any registrars that sell obscure TLDs such as .tt for a reasonable price? As for my definition of "reasonable": I understand it's not a .com and I'm going to have to pay more, but $3000 is just silly! I'd say anything under $100 is reasonable.

  • How do I install PHP 5.4.8 on a Win XP workstation?

    - by Cyberherbalist
    I am trying to create a MediaWiki site on my Windows XP workstation and need to install PHP 5.4.8. I have downloaded and unpacked the binaries for PHP, but the install instructions tell me to "Run the MSI installer and follow the instructions provided by the installation wizard." Unfortunately, there is no MSI file provided in the package! Am I missing something? Besides a clue, I mean. Edited to add: to be clear, I am not going to be able to work with Apache.

  • How to include content from remote server while keeping that content secure

    - by slayton
    I am hosting a collection of videos, for which I retain the copyright, on a file server that I'd like to share with family and friends. When a user visits my file server via a web browser, they are asked to authenticate using HTTP auth and are then presented with a basic list of the files. I'd like to build a web application that provides a clean interface with simple library functionality. However, this app will be hosted on a different server. I'm trying to figure out a security model for my file server that doesn't require the user to log in to both the file server and the hosting server. I want to make this as easy as possible for my non-tech-savvy family while still maintaining security for my files.
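
    One pattern that fits (an illustration, not a specific product): users authenticate only to the web app, and the app signs short-lived URLs with a secret shared with the file server, which the file server then verifies. A minimal Python sketch with hypothetical names:

        import hashlib, hmac, time

        SECRET = b"shared-between-app-and-file-server"  # hypothetical shared secret

        def sign_url(path, ttl=3600):
            # The app server issues a link that expires after ttl seconds
            expires = str(int(time.time()) + ttl)
            sig = hmac.new(SECRET, f"{path}|{expires}".encode(), hashlib.sha256).hexdigest()
            return f"{path}?expires={expires}&sig={sig}"

        def verify(path, expires, sig):
            # The file server accepts a request only if signature and expiry check out
            expected = hmac.new(SECRET, f"{path}|{expires}".encode(), hashlib.sha256).hexdigest()
            return hmac.compare_digest(expected, sig) and time.time() < int(expires)

        print(sign_url("/videos/holiday.mp4"))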

  • How to batch remove spamming users and pages they created on MediaWiki?

    - by Problemania
    I'm trying to clean up a MediaWiki instance which has been subjected to spamming and vandalism for a period of time. The current status is that there are a large number of users which only created spam pages but typically did not alter legitimate pages, and there are fewer than 10 users which I know are legitimate and created a small number of legitimate pages. Abstractly, my idea for fixing this mess is to find the complete list of users that are not in that small set of legitimate users, use the Renameuser extension to rename them all to a single Spammer user, and use the Nuke extension to mass-delete all pages that user created. Any practical advice on how to proceed? Since there are hundreds of spammer users, how do I rename them efficiently? It seems the Renameuser extension does not support automated batch renaming from a list or file.
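
    As an alternative to mass renaming, each spam account's page creations can be enumerated and deleted directly through the MediaWiki web API. A minimal Python sketch (assuming the requests library, a hypothetical wiki URL, a session already authenticated with delete rights, and a hypothetical spammer list):

        import requests  # third-party: pip install requests

        API = "https://wiki.example.org/w/api.php"  # hypothetical wiki
        s = requests.Session()  # assumed already logged in as an admin bot

        def nuke_user(user):
            # ucshow=new restricts the contribution list to page creations
            contribs = s.get(API, params={
                "action": "query", "list": "usercontribs", "ucuser": user,
                "ucshow": "new", "uclimit": "max", "format": "json"}).json()
            token = s.get(API, params={
                "action": "query", "meta": "tokens",
                "format": "json"}).json()["query"]["tokens"]["csrftoken"]
            for c in contribs["query"]["usercontribs"]:
                s.post(API, data={"action": "delete", "title": c["title"],
                                  "reason": "spam cleanup", "token": token,
                                  "format": "json"})

        for spammer in ["SpamUser1", "SpamUser2"]:  # hypothetical list
            nuke_user(spammer)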

  • Why do Google search results include pages disallowed in robots.txt?

    - by Ilmari Karonen
    I have some pages on my site that I want to keep search engines away from, so I disallowed them in my robots.txt file like this:

        User-Agent: *
        Disallow: /email

    Yet I recently noticed that Google still sometimes returns links to those pages in their search results. Why does this happen, and how can I stop it?

    Background: Several years ago, I made a simple web site for a club a relative of mine was involved in. They wanted to have e-mail links on their pages, so, to try and keep those e-mail addresses from ending up on too many spam lists, instead of using direct mailto: links I made those links point to a simple redirector / address harvester trap script running on my own site. This script would return either a 301 redirect to the actual mailto: URL, or, if it detected a suspicious access pattern, a page containing lots of random fake e-mail addresses and links to more such pages. To keep legitimate search bots away from the trap, I set up the robots.txt rule shown above, disallowing the entire space of both legit redirector links and trap pages.

    Just recently, however, one of the people in the club searched Google for their own name and was quite surprised when one of the results on the first page was a link to the redirector script, with a title consisting of their e-mail address followed by my name. Of course, they immediately e-mailed me and wanted to know how to get their address out of Google's index. I was quite surprised too, since I had no idea that Google would index such URLs at all, seemingly in violation of my robots.txt rule. I did manage to submit a removal request to Google, and it seems to have worked, but I'd like to know why and how Google is circumventing my robots.txt like that and how to make sure that none of the disallowed pages will show up in their search results.

    PS. I actually found out a possible explanation and solution, which I'll post below, while preparing this question, but I thought I'd ask it anyway in case someone else might have the same problem. Please do feel free to post your own answers. I'd also be interested in knowing whether other search engines do this too, and whether the same solutions work for them as well.
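
    The usual explanation for cases like this is that robots.txt controls crawling, not indexing: Google can still index a disallowed URL it discovers through external links, showing only the URL and an anchor-derived title. The standard remedy is a noindex signal such as an X-Robots-Tag response header, which only works if the crawler is allowed to fetch the URL. A minimal Python WSGI sketch of a redirector sending that header (an illustration of the technique, not the author's actual script):

        def application(environ, start_response):
            # Let crawlers fetch the URL but forbid indexing it
            headers = [("Location", "mailto:someone@example.org"),  # hypothetical target
                       ("X-Robots-Tag", "noindex, nofollow")]
            start_response("301 Moved Permanently", headers)
            return [b""]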

  • How will the search rank get impacted if I move my mobile website to a single-page application?

    - by rahul
    I have two different versions of my site: a desktop version and a mobile-optimised version. That is, for the same URL the server renders different HTML for different user agents. I had been using the Vary header for this scheme, as recommended by Google. However, I now want to move the mobile website to a single-page application, for several reasons. I want to know: if Google stops seeing anything on my mobile web version but the desktop version continues to work as it is, how would the search rank be impacted, given that the mobile web gets more traffic than the desktop version? And how would the Vary header come into play?

  • Domain in PENDINGDELETE, question about drop

    - by kcdwayne
    A domain I want is in the pendingDelete stage according to WHOIS. I have been monitoring it since the redemptionPeriod, and it entered pendingDelete five days ago. After checking a few services (SnapNames, etc.), they report it is scheduled to drop on the 11th (7 days away, by my calculations). I'm not quite sure what to believe. The domain isn't highly valuable; it is to me and one other company. I can see no backorders placed on the big-name sites, so I'm thinking of trying to get it without a backorder service. Any insight as to when it will actually drop? I've read 11AM-2PM PST, but I'm unsure. Thanks.

  • What is an easy way to see how often recently added pages are viewed in google analytics?

    - by cboettig
    Google Analytics makes it very easy to see the number of views of the most-viewed pages, but I cannot figure out how to see the number of views a particular page has received, or the number of views of recently added pages (e.g. blog posts). Is it possible to sort the pageviews list by date the page was added? Can this be done without having to externally create a list of recent pages and use the analytics API?

  • Web host recommendation [closed]

    - by birdus
    Possible Duplicate: How to find web hosting that meets my requirements?
    I'm researching a web host for a client and am looking for any recommendations of hosts you may have used and been happy with. Here are the requirements I've been given. The hosting service needs to either provide or allow us to add the following functionality:
    i. ASP/ASP.Net
    ii. video streaming
    iii. audio streaming
    iv. reporting
    v. RSS feeds
    vi. site search
    vii. forums
    viii. podcasts
    ix. Flash
    x. CMS: looking at using Percussion Software
    xi. PII registration
    xii. tie into SF.com (Sales Force)
    They also want to have a pre-prod server available so they can test the website before going public with it. This may just be a matter of paying extra for another site/server. Thanks for the help.

  • Premium cPanel Accounts Giveaway

    - by mamta
    I found Book My Cloud.com, which is offering free cPanel hosting accounts. The way to request one is to contact their support team (Request now: hxxp://support.book my cloud.com/index.php?a=add) and select the request type "Free Cpanel Hosting Related". They provide:
    Instant activation
    Disk quota: 10 GB
    Monthly bandwidth: 300 GB
    Max FTP accounts: 5
    Max email accounts: unlimited
    Max email lists: unlimited
    Max databases: 500
    Max subdomains: 500
    Max parked domains: 100
    Max addon domains: 1000
    Control panel: cPanel 2012

  • Working with Google Webmaster Tools

    - by com
    My first question is about Crawl Errors in Google Webmaster Tools. Crawl Errors is divided into a few sections, one of which is HTTP. I assume the broken links under HTTP were found by the crawler somehow and are not the links from the sitemap. If they were found by scanning all sitemap pages for links, why doesn't it mention the source page, as the Sitemaps section does with its Linked From column? And what is the meaning of Linked From there? I thought that if the section is named Sitemaps, all URLs should be taken from the sitemap, so why is there a Linked From at all?
    The second question: what is the best way to treat on-site search? How do search result pages end up getting indexed? Because search result pages are being indexed, I have too many pages in Linked From. What's the right practice?
    Question three: in order to improve response time in WMT, can I redirect all crawler requests to a designated free web server? Is this good practice?
    Question four: how should I treat the Google Analytics code (with parameters PageView, PageLoadTime) in the case where a user requests a non-existent page? Should I render the Google code or not? Right now I use the Google Analytics code on the common template page, so that every page, including non-existent pages with an error message, contains the Google Analytics code; it seems to have an influence on WMT.

  • What could have caused a large traffic drop from Google in early May?

    - by Scott Schluer
    I have a website (www.equispot.com) that has been indexed for almost 2 years in Google. I managed to get myself on the first page (average position 6-8) on Google for my target keyword of "horses for sale" and held there pretty solidly for months. Suddenly, with no changes to the site, traffic from Google dropped like a rock in early May. I slowly fell in position until now I'm sitting at the bottom of page 4. I have never hired an SEO firm, have not used any "black hat" techniques that Google would have penalized me for in their May update, etc. I'm not familiar enough with SEO to know how to look at link profiles, etc. to tell if there's something wrong. I've run my site through a DNS checker and it came back with no errors. Google Webmaster Tools shows no messages or notices of any kind, just a drop in traffic. GWT also shows only 2 server errors and 1 404. Is there anyone who can tell me by quickly checking my domain if there's an obvious reason that my traffic would have fallen so far, something that I can fix?

  • How to make Facebook like button narrower than 225px?

    - by tog22
    I'm generating a like button with Facebook's 'standard' layout for my site via https://developers.facebook.com/docs/reference/plugins/like/ . I've set its width to 200 pixels, but notice that setting it to lower than 225 pixels has no effect, and the documentation on that page indeed specifies 225px as the minimum width for the standard layout. Unfortunately I need to make it 200 pixels wide to fit my site's design. Is there any way to force it into this width? (The site's at http://gwwc2.centreforeffectivealtruism.org/ if you want to have a play with Firebug, though the like button gets generated by javascript so you'd probably have to duplicate that page and edit its source.)

  • How to automatically generate an HTML table from an image in Linux?

    - by alfish
    In Photoshop, you can easily divide an image into zones by pointing and clicking, and it automatically generates the corresponding HTML, with the image slices addressed in tables. GIMP also has a Slice filter (Filters > Web > Slice), but it is rudimentary and, as far as I can see, does not allow point-and-click selection of slices. I am wondering if this functionality can be added to GIMP, or whether there is other Linux software to do this. I hate to return to Windows just to do this simple task, which I happen to need frequently. Thanks in advance for your suggestions.
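
    Short of a GIMP plug-in, the slicing itself is easy to script. A minimal Python sketch using the Pillow imaging library that cuts an image into an even grid and emits the corresponding HTML table (file names and grid size are hypothetical):

        from PIL import Image  # third-party: pip install Pillow

        def slice_to_table(path, rows, cols, prefix="slice"):
            img = Image.open(path)
            w, h = img.size
            tile_w, tile_h = w // cols, h // rows
            html = ['<table cellspacing="0" cellpadding="0">']
            for r in range(rows):
                html.append("  <tr>")
                for c in range(cols):
                    # Crop one grid cell and save it as its own image file
                    box = (c * tile_w, r * tile_h, (c + 1) * tile_w, (r + 1) * tile_h)
                    name = f"{prefix}_{r}_{c}.png"
                    img.crop(box).save(name)
                    html.append(f'    <td><img src="{name}" alt=""></td>')
                html.append("  </tr>")
            html.append("</table>")
            return "\n".join(html)

        print(slice_to_table("banner.png", rows=2, cols=3))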
