Search Results



  • How can I allow robots access to my sitemap, but prevent casual users from accessing it?

    - by morpheous
    I am storing my sitemaps in my web folder. I want web crawlers (Googlebot etc.) to be able to access the files, but I don't necessarily want everyone else to have access. For example, this site (superuser.com) has a sitemap, as specified by its robots.txt file (http://superuser.com/robots.txt), yet when you type http://superuser.com/sitemap.xml, you are directed to a 404 page. How can I implement the same thing on my website? I am running a LAMP site, and I am using a sitemap index file (so I have multiple sitemaps for the site). I would like to use the same mechanism to make them unavailable via a browser, as described above.
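
    One approach, as a minimal sketch: a mod_rewrite rule in .htaccess that returns a 404 for sitemap files unless the client identifies as a known crawler. The bot list is illustrative, and User-Agent strings can be spoofed, so this only deters casual visitors (superuser.com may well do something different):

        RewriteEngine On
        # return 404 for sitemap files when the client does not identify as a known crawler
        RewriteCond %{HTTP_USER_AGENT} !(Googlebot|bingbot|Slurp) [NC]
        RewriteRule ^sitemap.*\.xml$ - [R=404,L]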

    Read the article

  • How to save a PNG as a smaller file with the same resolution?

    - by Radek
    I'm not sure which Stack Exchange site is best for this question. I have a scanned JPG file of 8.5 MB with these properties:
    pixel dimensions: 2468 × 3484 pixels
    print size: 208.96 × 294.98 millimeters
    resolution: 300 × 300 ppi
    I need to save the file as a PNG no bigger than 4 MB. Most importantly, the picture's dimensions must remain the same; I mean that the size of the objects in the picture must stay the same. Could anybody tell me what determines the size of the objects in the picture?
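
    As a hedged sketch of one way to try this: ImageMagick can write a PNG while keeping the pixel dimensions and the 300 ppi density metadata intact, and reducing the palette (e.g. to 256 colors) is what typically shrinks a scanned image under a size cap. File names are illustrative:

        # keep pixels and print density, shrink the file by quantizing colors
        convert scan.jpg -colors 256 -units PixelsPerInch -density 300 scan.png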

    Read the article

  • How can I find out if my domain has been added to email blacklists?

    - by Rob Sobers
    We do a lot of mass emailing of our contacts to promote events, send out newsletters, etc. Some people read and react, some people unsubscribe, but I fear that some might actually mark the email as spam. Is there any way to figure out whether my domain has been added to email blacklists or spam registries? Also, if I use a service like MailChimp to send the emails, how would this work? If one unscrupulous customer was using MailChimp for evil, wouldn't it affect all of their customers?
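
    One way to check, as a minimal sketch: most blacklists are DNS-based, so you can query them directly. IP-based lists take the reversed IP of your mail server; domain-based lists such as the Spamhaus DBL take the domain itself (an answer in 127.0.x.x means "listed", no answer means "not listed"):

        dig +short 2.0.0.127.zen.spamhaus.org     # reversed-IP lookup (127.0.0.2 is the documented test entry)
        dig +short example.com.dbl.spamhaus.org   # domain lookup against the Spamhaus DBL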

    Read the article

  • How can I host a website on a dynamically-assigned IP address?

    - by nick
    I recently upgraded my internet connection to the point that it is much faster and more reliable than my current web host. I would like to move my domain to be hosted at home, but my IP address is dynamic. As far as I know, I only get a new IP when I restart my modem and/or router (which is almost never) or when Cable One (my ISP) pushes out a firmware update (rarely). There are a few ways I can see doing this:
    1. Convince my ISP to give me a static IP.
    2. Assign my router my current IP to force a static IP (which might work?).
    3. Set my DNS record to my current IP address and update it on the rare occasions it changes (see the sketch below).
    Obviously I'm hoping that the first one works, but I don't want to pay a lot of extra money (if that's what it takes) to get a static IP address. Which of these options will work most reliably?
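
    As a hedged sketch of option 3, a cron job can re-point the DNS record whenever the public IP changes. The update URL here is hypothetical; real dynamic-DNS providers each document their own HTTP update API:

        #!/bin/sh
        # run from cron; updates DNS only when the public IP has changed
        CURRENT=$(curl -s https://ifconfig.me)
        LAST=$(cat /tmp/last_ip 2>/dev/null)
        if [ "$CURRENT" != "$LAST" ]; then
            curl -s "https://dyndns.example.com/update?hostname=mydomain.com&ip=$CURRENT"  # hypothetical endpoint
            echo "$CURRENT" > /tmp/last_ip
        fi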

    Read the article

  • Best ways to collect location-based user input

    - by user359650
    I'm working on a website where users will be able to register and provide information about their location. To prevent users from entering incorrect data, we don't want free-text input; instead, users should choose from predefined values as much as possible. We see two ways of providing those values: use an API from an external service provider, or build our own local database.

    APIs. Some resources:
    - https://developers.facebook.com/docs/reference/ads-api/get-autocomplete-data/
    - http://developer.yahoo.com/geo/geoplanet/
    Pros:
    - accuracy and completeness of data
    - no maintenance related to updating the data, as that is taken care of by the API provider
    - easier/faster to get started (no need to create a local database, just implement the API)
    Cons:
    - degraded performance when the external API has availability issues
    - outages caused by changes to the external API (until your code is updated to reflect them)
    - lock-in with the external provider

    Local database. Some resources:
    - http://developer.yahoo.com/geo/geoplanet/data/
    - http://www.maxmind.com/app/geolitecity
    - http://download.geonames.org/export/dump/
    Pros:
    - no external dependency: improved stability and performance
    Cons:
    - more work to get started (you need to create the database and the code to interact with it)
    - risk of inaccurate/incomplete data, either initially or over time
    - more maintenance work to keep the database up to date

    Assuming the depth of information requested from users is as follows:
    - country: interested in the value; also used to narrow down the list of regions
    - region (state in the US, county in the UK...): not interested in the value itself, only used to narrow down the list of cities
    - city: interested in the value (which can be used to work out the related region should we need regional statistics)
    - address: interested in the value, although OPTIONAL

    Which option (API or local database) would you choose? What implementation tips would you give (see the sketch below for the local-database option)? What other resources can you share?
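
    To make the local-database option concrete, a hedged sketch assuming the GeoNames dump above has been imported into a table named geoname with its documented columns:

        -- cities of a given country/region for a drop-down;
        -- feature class 'P' marks populated places in GeoNames
        SELECT name
        FROM geoname
        WHERE country_code = 'US'
          AND admin1_code = 'NY'
          AND feature_class = 'P'
        ORDER BY population DESC;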

    Read the article

  • Blogger widget speed problems

    - by Wladimir Ivanov
    Recently I installed Google Analytics on a Blogger account. I was shocked when I saw load times for the landing pages between 10 and 60 seconds. The blog uses a Facebook like-box, a Twitter recent-messages box, a live traffic feed widget, and Lockerz share buttons. Almost every post in this blog contains YouTube iframes, which are nowhere near fast. Are there any well-known solutions for this type of problem? Should I use some jQuery plugins for speed optimization? How can I make the Facebook/Twitter boxes load faster?
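
    A common remedy, sketched minimally: replace each YouTube iframe with a lightweight placeholder and inject the iframe only on click, so the embeds stop blocking the initial load. The class and data attribute names are made up for this example; the same deferred-loading idea applies to the Facebook and Twitter widgets by loading their scripts asynchronously after the page renders:

        <div class="yt-placeholder" data-id="VIDEO_ID"><!-- thumbnail image here --></div>
        <script>
        document.addEventListener('click', function (e) {
          // only react to clicks on a placeholder
          var ph = e.target.closest('.yt-placeholder');
          if (!ph) return;
          var iframe = document.createElement('iframe');
          iframe.src = 'https://www.youtube.com/embed/' + ph.getAttribute('data-id') + '?autoplay=1';
          iframe.width = 560;
          iframe.height = 315;
          ph.parentNode.replaceChild(iframe, ph);
        });
        </script>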

    Read the article

  • How long should my HTML page title really be?

    - by RandomBen
    How long should the text within my <title></title> tags really be? I know Google cuts it off at some point, but when? When I use IIS7's SEO Toolkit 1.0, I get an error stating my title should be under 65 characters. I have a book by Bruce Clay that says I should use 62-70 characters, roughly 9 ± 3 words. I have also used SenSEO's Firefox add-on, which says I should use a maximum of 65 characters, roughly 15 words. What is the maximum, really? I have two sources saying 65 and one saying 70, but Bruce Clay is generally held in high regard.

    Read the article

  • Hosting multiple client websites on a single account

    - by Bhavesh Gangani
    I'm a web designer, and I currently have only a few clients who need websites. I have an unlimited hosting account, and I want to host their websites in my account without a reseller account (which isn't really needed in my case). My clients' only need is FTP access to their personal directories. So, as asked above: is it possible to give them separate phpMyAdmin access in this setup? As far as I know, this is done with "addon" domains pointing to directories in my hosting account via cPanel; am I right? Or is there another solution, apart from a reseller account?

    Read the article

  • Joomla and Google Analytics advanced options in tracking code

    - by miako
    I want to insert the Google Analytics tracking code into my Joomla site, so I registered on Google's official site and saw that there is an advanced tab with three more options than the standard one. Do I have to check "I want to track dynamic pages" and "I want to track PHP pages"? Do these options give me better results, or are they necessary for a dynamic PHP-based site like Joomla? Does anyone know the installation process? I didn't manage to make it work by following this. Also, where do I place the tracking code? Because of some bugs, some say it is better just after the opening <body> tag, whereas others say just before the closing </body> tag. Thank you.
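
    For reference, a sketch of the asynchronous ga.js snippet as Google documented it around this time (verify it against the code your own Analytics profile generates; UA-XXXXX-X is the placeholder for your property ID). Being asynchronous, it was recommended for the end of <head> rather than either <body> position:

        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-XXXXX-X']);
          _gaq.push(['_trackPageview']);
          (function() {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
          })();
        </script>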

    Read the article

  • Robots.txt Disallow command [on hold]

    - by Saahil Sinha
    How do I disallow folders through robots.txt when they are being crawled due to a wrong URL structure, causing duplicate-page errors? The URLs crawled incorrectly by Google, leading to the duplicate-page errors, look like:
    www.abc.com/forum/index.php?option=com_forum
    The actual correct pages, however, are:
    www.abc.com/index.php?option=com_forum
    Is excluding them through robots.txt the correct approach? To exclude www.abc.com/forum/index.php?option=com_forum, the directive would be:
    Disallow: /forum/
    Won't this also block the site's legitimate "Forum" component?
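
    For what it's worth, a minimal sketch of the directive in context. robots.txt matching is against the URL path prefix, so Disallow: /forum/ blocks only URLs whose path begins with /forum/; it does not affect www.abc.com/index.php?option=com_forum, whose path begins with /index.php:

        User-agent: *
        Disallow: /forum/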

    Read the article

  • Redirecting in node.js behind a mod_rewrite proxy

    - by chmanie
    I have a node.js application running behind an Apache mod_rewrite proxy, configured in a .htaccess file like this:
    RewriteCond %{HTTP_HOST} =mydomain.com [OR]
    RewriteCond %{HTTP_HOST} =www.mydomain.com
    RewriteRule (.*) http://localhost:3000/$1 [QSA,P]
    When I now do a redirect (e.g. Express's res.redirect()) inside my node.js application (which runs on port 3000), the user is always redirected to http://localhost:3000/ (which is in fact exactly what is defined above, but not the desired behaviour). Is there any way around this?
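
    A hedged sketch of a likely fix: with the [P] flag, Apache forwards Host: localhost:3000 to the backend, and Express builds absolute redirect URLs from that Host header. Preserving the client's original Host usually resolves this; note that the directive belongs in the server or virtual-host configuration, not in .htaccess:

        # httpd.conf / vhost config: pass the client's original Host header to the node.js app
        ProxyPreserveHost On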

    Read the article

  • No date/time shown before my page in Google search results

    - by Ruut
    I know that by changing the meta description of my web page, I can control the text Google shows in the search results. However, I do not know how to control the text shown just before the snippet, for example the date when the page was last updated. Which meta tag do I use to accomplish this? UPDATE: my web page is automatically updated weekly, at irregular intervals, by a cron job that changes the MySQL database holding the content of my pages. So the question is what (meta) information to add to my page.
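
    As far as I know, no meta tag reliably sets that date; Google mostly infers it from the page and crawl data. One signal you can control, sketched here from the sitemaps protocol, is the <lastmod> field, which the cron job could rewrite after each database update:

        <url>
          <loc>http://www.example.com/page.html</loc>
          <lastmod>2012-10-01</lastmod>
        </url>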

    Read the article

  • What Ranking Factors Are Used For International Search?

    - by Itai
    Google.com vs Google.ca vs Google.co.uk (etc.) all rank their results differently; the intention is to return more locally relevant content. What factors, other than the ones below, are used to determine local relevance? I already know the TLD (.com, .ca, etc.) and likely the server IP address are used, but there has to be more, as this would not explain some search results I noticed this week. In particular, I see a US-based site ranking #3 for some keywords on Google.com, ranking #5 on Google.ca, and not ranking within the first pages on Google.co.uk. On Google.com it outranks an Australian site, which in turn outranks it on Google.ca. The site itself is relevant to all English-speaking locations, yet it is being outranked on different Google TLDs by sites from other regions (and not by sites from the same region as the TLD).
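
    One relevant factor beyond TLD and server IP, sketched as a hypothetical markup example: rel="alternate" hreflang annotations, which let a site tell Google explicitly which URL targets which English-speaking region (Webmaster Tools geographic targeting and the origin of inbound links are commonly cited as further signals):

        <link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
        <link rel="alternate" hreflang="en-gb" href="http://example.com/uk/" />
        <link rel="alternate" hreflang="en-ca" href="http://example.com/ca/" />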

    Read the article

  • Solution for payment gateway with multiple sellers

    - by pvieira
    I'm looking for a payment gateway that can be used on a website with multiple sellers. Depending on the purchased item, a given seller/merchant should receive the money. Would that be possible using only one "master merchant" account that acts as a "distributor" of funds to several "sub-merchants"? Does any well-established provider (PayPal, WorldPay, Authorize.Net, etc.) support this?
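
    At the time of writing, PayPal's Adaptive Payments API supports this "chained payment" model. A hedged sketch of the Pay call in NVP form, where the master merchant is the primary receiver and a sub-merchant receives a share (field names follow the Adaptive Payments documentation as I recall it; emails and amounts are illustrative):

        actionType=PAY
        currencyCode=USD
        receiverList.receiver(0)[email protected]
        receiverList.receiver(0).amount=100.00
        receiverList.receiver(0).primary=true
        receiverList.receiver(1)[email protected]
        receiverList.receiver(1).amount=90.00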

    Read the article

  • Spam link text when searching for company directors' names

    - by Alex
    It was brought to my attention that if you search for the name of one of our directors (with the intent of finding their profile page on our site), the profile comes up as the first link in most search engines, as you would expect, but the link text is pure spam. The search strings I have tested on Google, Bing, Ask, and Yahoo all return similar results:
    Paolo Rossi futex
    Mark Rossi futex
    Marco Rossi futex
    Dan Goldberg futex
    Any idea what might be causing this? I have searched through as much of the site's code as I can and can't find anything wrong with it.
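
    One quick diagnostic, as a minimal sketch: hacked sites often cloak the spam so it is served only to crawlers, in which case the page looks clean in a normal browser. Fetching a profile page with Googlebot's User-Agent string can expose that (URL illustrative):

        curl -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
             http://www.example.com/directors/paolo-rossi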

    Read the article

  • Open Source Bulletin Board with Facebook Group Integration

    - by Brian
    I'm working on an open-source, community-oriented project that needs a highly social component where users can post discussion topics and questions and interact with each other. It would be ideal to have discussion flow seamlessly between a bulletin board and Facebook. Has anyone seen such an integration? I'm talking about something that goes beyond a simple Facebook OAuth login and actually synchronizes forum posts, topics, and comments in both directions. Pretty please, if a moderator is going to delete this, tell me which Stack Exchange site is the appropriate place for posting such an inquiry. :)

    Read the article

  • Duplicate content appearing for multilingual sites

    - by Rocky Singh
    I have a site with a default URL, say http://www.blahblah.com/ (the default is English). The site supports multiple languages: my home page has links such as "English", "French", and "Spanish", and on clicking them the user is redirected to:
    http://www.blahblah.com/en-us/ (English)
    http://www.blahblah.com/fr-ca/ (French)
    http://www.blahblah.com/spanish-culture/ (Spanish)
    Based on the culture in the URL, I show end users the content in their desired language. The issue I am having is with SEO. I noticed (via Google Webmaster Tools) that Google considers my pages duplicates, for example:
    1. http://www.blahblah.com/documents/ and http://www.blahblah.com/en-us/documents/
    2. http://www.blahblah.com/news/ and http://www.blahblah.com/en-us/news/
    Similarly, all my pages are flagged as duplicate content in Google Webmaster Tools. I'm worried my site is being penalized in the rankings because of this. Could you suggest how to overcome this situation?
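
    One common fix, sketched under the assumption that the en-us pages should be the indexed versions: a rel="canonical" link in the <head> of each duplicate, pointing at the preferred URL (a 301 redirect from the bare URLs to their en-us equivalents achieves the same consolidation):

        <link rel="canonical" href="http://www.blahblah.com/en-us/documents/" />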

    Read the article

  • How to open the JavaScript console in different browsers?

    - by Šime Vidas
    Updated on October 7th, 2012.

    Chrome: Press CTRL + SHIFT + J to open the "Console" tab of the Developer Tools. Alternative method: press CTRL + SHIFT + I or F12 to open the Developer Tools, then press ESC (or click "Show console" in the bottom right corner) to slide the console up. Note: in Chrome's dev tools there is a full "Console" tab, but the smaller "slide-up" console can be opened while any of the other tabs is active.

    Safari: Press CTRL + ALT + I to open the Web Inspector, then see Chrome's second step (Chrome and Safari have pretty much identical dev tools). Note: this only works if the "Show Develop menu in menu bar" check box in the Advanced tab of the Preferences menu is checked!

    IE9: Press F12 to open the developer tools, then click the "Console" tab.

    Firefox: Press CTRL + SHIFT + K to open the Web Console; or, if Firebug is installed (recommended), press F12 to open Firebug and click its "Console" tab.

    Opera: Press CTRL + SHIFT + I to open Dragonfly, then click the "Console" tab.

    Read the article

  • On what criteria should I evaluate domain registrars?

    - by jdotjdot89
    Though I've been a web developer for a fair amount of time, I am about to buy a few domain names for the first time. I have looked into the domains I'm going to buy and know they're available, and I've been researching which registrars to use. After doing a lot of research, the main ones I'm considering are 1&1, Namecheap, and Gandi. The problem is, as I continue researching, I'm not really sure what distinguishes one registrar from another. I don't need much in the way of services: definitely not hosting, since I plan to use Heroku for that. I mainly need the domain itself and DNS management, and possibly SSL certificates and WHOIS protection. Question: What makes one registrar different from another? How can I evaluate which one is best for me? Note: this question is not about which registrar is the best, but rather about what criteria I can use to evaluate them and rank one over another; they all seem pretty similar to me right now.

    Read the article

  • Building an SMS web application [closed]

    - by ramesh babu
    Possible Duplicate: How to add SMS text messaging functionality to my website?
    I would like to build a web application whose purpose is to send and receive SMS. I have researched a lot, but I still don't understand the requirements. Simply put, I want an application similar to way2sms.com, and I don't want to buy SMS capacity from a company; I would like to build my own infrastructure. I have the web design side covered. I would like to know what is required to send and receive SMS, and what infrastructure I need to do it.
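
    For a sense of what "own infrastructure" means at the smallest scale, a hedged sketch: a USB GSM modem with a SIM card, driven by the open-source Gammu tool (higher volumes typically need an SMS gateway such as Kannel speaking SMPP to a carrier, which is what commercial providers run):

        # send a text through a locally attached GSM modem (gammu must first be configured for the modem)
        gammu sendsms TEXT +4512345678 -text "Hello from my website"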

    Read the article

  • URL masking with .htaccess

    - by Michael Nguyen
    I need an experienced programmer to help me with URL masking. This is my situation: my website www.michaelfotograf.dk/blog/ is the main site that needs to be configured. I have another website/web hotel, www.umagepar.dk, which is redirected to www.michaelfotograf.dk/blog/. The blog is an ongoing project where I post a lot of content to raise my Google ranking. www.umagepar.dk is also a project in its own right, so I do not want people to see that it is connected with www.michaelfotograf.dk/blog/. I therefore need to mask www.michaelfotograf.dk/blog/ so that the URL shown in the address bar is www.umagepar.dk at all times. What I need is a programmer who can do this for me. Name your price, and I'll see if I can afford it. Michael
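
    A hedged sketch of what such masking usually looks like: instead of redirecting, umagepar.dk reverse-proxies the blog, so the address bar never changes (this assumes mod_rewrite with proxying enabled on the umagepar.dk host; note that serving the same content on two domains can itself create duplicate-content issues):

        RewriteEngine On
        # fetch the blog's pages behind the scenes and serve them under umagepar.dk
        RewriteRule ^(.*)$ http://www.michaelfotograf.dk/blog/$1 [P,L]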

    Read the article

  • How can I remove the security/malicious user warning from my website?

    - by BigBoy1337
    I have a domain name, tradespring.net (and www.tradespring.net), that redirects to my Heroku app via a CNAME record. However, when I first try to access these sites, the browser gives me a security warning: "This is probably not the site you are looking for!", with "proceed anyway" or "back to safety" options. It's because my browser realizes that it is redirecting. How can I make sure anyone's browser (not just mine) trusts this site and my Heroku app? I don't think I need an SSL certificate, because this site does not handle sensitive info (credit card numbers, etc.).
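
    A hedged diagnostic: this warning typically appears on https:// URLs when the certificate presented belongs to a different hostname; with a CNAME to Heroku, that is often a *.herokuapp.com certificate rather than one for the custom domain. Inspecting what the server actually presents can confirm it:

        openssl s_client -connect www.tradespring.net:443 -servername www.tradespring.net 2>/dev/null \
          | openssl x509 -noout -subject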

    Read the article

  • How to avoid email reply from my web site being marked as spam? [closed]

    - by Eric
    Possible Duplicate: How could I prevent my mail from being recognized as spam?
    Here's the situation:
    1. A customer fills out the inquiry form on the web site.
    2. That inquiry goes to person X.
    3. Person X goes to my web site (mysite.com), presses some keys, and the customer gets an email from [email protected].
    Here's my question: how can I be sure the email from [email protected] always gets through to the customer? Can I help it along by using SPF or some other email authentication framework/solution? Thank you-- E
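
    SPF can indeed help. A minimal sketch of a TXT record telling receivers which hosts may send mail for mysite.com (the mechanisms and IP are illustrative; "-all" tells receivers to reject everything else, and pairing SPF with DKIM signing helps further):

        mysite.com.  IN TXT  "v=spf1 a mx ip4:203.0.113.10 -all"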

    Read the article

  • Where does the URL parameter "?chocaid=397" come from?

    - by unor
    In Google Webmaster Tools, I noticed that my front page was indexed twice:
    example.com/
    example.com/?chocaid=397
    I know I could fix this with a link of type canonical, but I wonder: where does this parameter come from? There are various sites that have pages indexed with this very parameter/value: https://duckduckgo.com/?q=chocaid%3D397. I looked for similarities between these sites but couldn't find a conclusive one:
    - It's often the front page, but not in every case.
    - Some are NSFW, but not all.
    - When one domain's URLs have this parameter, other subdomains of the same domain often have it too.
    Examples: a Wikipedia entry, Microsoft CodePlex.

    Read the article

  • How to handle possible duplicate content across multiple sites?

    - by ElHaix
    Let's say I have two sites that cover the same vertical/topic, one in the USA and one in Canada. Both sites have locally relevant content, which is obviously unique by location; however, they will share common news or blog pages. How do I avoid getting hit with duplicate content on both sites for those news/blog pages? If the content is exactly the same, I'm guessing I would have to pick which site's content to noindex,nofollow. Is that correct? And if so, is that all I have to add, on the URL links to those pages and in the pages' meta tags?
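
    A minimal sketch of the page-level tag in question, placed in the <head> of whichever site's copy you pick not to index (noindex alone is what keeps the page out of the index; nofollow additionally stops its links from being followed):

        <meta name="robots" content="noindex,nofollow" />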

    Read the article
