Search Results

Search found 24209 results on 969 pages for 'site'.

  • "Couldn't resolve host" for any external content

    - by scatteredbomb
    On our site we run a few scripts that talk to external services (uploading to Amazon S3, pulling data from Chartbeat, a script to count Twitter followers), and all of them just stop working from time to time. They work most days, but some days (like today) they all fail at once. This simple script to get a follower count into PHP:

        $url = "http://twitter.com/users/show/username";
        $response = file_get_contents($url);
        $t_profile = new SimpleXMLElement($response);
        $count = $t_profile->followers_count;

    just sits there for a couple of minutes, then finally spits out an error saying "Couldn't resolve host". Any script we use for an external site gives us this error. I'm not really sure where to check what's blocking these connections all of a sudden, or why it works most of the time, fails for a day or so, then works again. Any tips?

    Update: contents of resolv.conf:

        search 147.225.210.rdns.ubiquityservers.com
        nameserver 72.37.224.5
        nameserver 72.37.224.6
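
    Since every external host fails at once, it is worth separating DNS from HTTP. A minimal diagnostic sketch in PHP (the hostname is just an example; gethostbyname() returns the name unchanged when resolution fails):

        <?php
        // Does the DNS lookup itself hang, or the HTTP fetch afterwards?
        $host = 'twitter.com';

        $start = microtime(true);
        $ip = gethostbyname($host); // returns $host unchanged on failure
        $elapsed = round(microtime(true) - $start, 2);

        if ($ip === $host) {
            echo "DNS lookup failed after {$elapsed}s - check the resolvers in /etc/resolv.conf\n";
        } else {
            echo "Resolved {$host} to {$ip} in {$elapsed}s - DNS is fine, suspect the HTTP layer\n";
        }

    If the lookup is what stalls, the next step is to test each nameserver from resolv.conf directly (for example, dig @72.37.224.5 twitter.com) to see whether one of them times out.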

    Read the article

  • CNTLM issue with intranet (maybe DNS)

    - by htorque
    On my Linux box I need to use an ISA proxy that requires authentication to reach the internet. I therefore installed CNTLM, configured it to point at the proxy address, and set it to listen on port 4321. I then configured my GNOME distribution to use localhost:4321 as the global proxy for HTTP and HTTPS. The result: I can connect to the internet, I can ping intranet IPs, and I do get name resolution for intranet sites, yet I cannot ping intranet hosts by name or open any intranet site in a browser (configured to use the distribution's proxy) unless I use the site's IP address. I tried blocking the intranet IP range in the CNTLM config file, without luck.
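
    If the goal is to keep intranet traffic away from the proxy entirely, CNTLM has a NoProxy directive for exactly this. A sketch of /etc/cntlm.conf (the proxy address and subnets are placeholders; substitute your real intranet ranges):

        # Bypass the parent proxy for local and intranet ranges (placeholders)
        Username    youruser
        Domain      yourdomain
        Proxy       isa.proxy.example:8080
        Listen      4321
        NoProxy     localhost, 127.0.0.*, 10.*, 192.168.*

    The same bypass may also need to exist on the GNOME side (its "ignore hosts" list), since the browser decides what goes to localhost:4321 before CNTLM ever sees the request.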

    Read the article

  • Copy wrongs and Copyright

    - by Tony Davis
    Recently, a Chinese blog website copied, wholesale and without permission, a Simple-Talk article on troubleshooting locking and blocking. Our initial reaction was exasperation and anger, tempered slightly by the fact that there was, at the top, a clear link to the original and to the book from which it was extracted. On the day the copy was posted, our original article saw a 30K spike in visits, so the site clearly has a substantial following!

    This made us pause for thought. Indeed, we wondered whether it might not be more profitable, and certainly more enjoyable, to notify the offender of similar content and serve a "put up" notice, rather than the usual DMCA "take down". The DMCA request, issued to protect our and our authors' assets, is a necessary but tiresome chore. So often, simple communication and negotiation could have averted the need for it. We are, after all, in the business of presenting knowledge, information and help to the SQL Server community. If only they had asked!

    Of course, one's attitude changes according to the motivation behind the copying of content. One motivation seems to be pure vanity: they do it to pad a CV, or to lend their company an expertise it doesn't possess. There is a class of plagiariser, however, that does it purely for money, earning advertising revenue by attracting hapless readers to their site. Not content with stealing content, such sites can buy services that fake visitor traffic so realistically that even the search engines are fooled. Stolen content, fake visitors, swindled advertisers.

    Zero-tolerance is really the only way of dealing with plagiarism, and action will only be completely effective once Bing, Google, and the other search engines strike from their listings the rogue sites that refuse to take down plagiarised content. It is, after all, in everyone else's interests. Cheers, Tony.

    Read the article

  • Someone using my website for Email and significant increase in spam

    - by Joy
    Let me give you the background so that you know the full story. Last summer my web guy (he put my website together) got into a fight with someone who attempted to register on my site using the name of my company as part of his user name. I was not aware of this at all until it had escalated dramatically. I don't know why my web guy was so unprofessional in his response to this person. I don't really know him; we met via SCORE and have never met in person. He is a vendor.

    Anyway, the guy who got into it with my web guy then threatened to do all he could to hurt my business, and said he was internet savvy, etc. Nothing seemed to happen for a while. Then, not long ago, this guy attempted to send me a friend request on LinkedIn. After his behavior I declined it. Shortly afterwards I began seeing a dramatic increase in spammers posting comments on the blog part of my site. Just lately I have been receiving emails from a variety of names, but all with the "@___" domain that I own for my business. I had additional security added, so now people have to register in order to comment on my blog, and I am seeing a lot of registration attempts from the same (and similar) IP addresses, with bogus names and weird email addresses, being blocked. So it is not creating more work, as it is all automatic.

    The email addresses are of more concern. Is there a way to identify a person through an IP address, or a place to report the behavior or the email usage? This guy lives in South Carolina, so he is not overseas. Any help/advice you can provide will be greatly appreciated. Thanks, Joy
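
    On the IP question: you generally can't identify an individual from an IP address without involving their ISP, but you can find out which network owns the address and where to send abuse reports. A sketch using whois (the address is a placeholder from the RFC 5737 documentation range):

        # Look up the network owner for an address seen in your logs
        whois 203.0.113.45
        # The output usually lists an abuse contact (e.g. OrgAbuseEmail
        # on ARIN records) - that is the address for complaints

    If the emails forge sender addresses at your own domain, publishing an SPF record for the domain is also worth a look, so receiving servers can reject mail that doesn't originate from your servers.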

    Read the article

  • Is that why the website is called serverfault <g>

    - by bmullan
    Couldn't help it... just trying to be funny... when trying to save my profile I got the message from your website, see below. Or is that an "initiation"? Ha! Anyway... I've only read a couple of threads, but it's good so far and I hope to read more. Brian

    = = = = = = = = = = = = = = = = = = = = = = = = = = =

    We apologize for any inconvenience, but an unexpected error occurred while you were browsing our site. It's not you, it's us. This is our fault. Detailed information about this error has automatically been recorded and we have been notified. Yes, we do look at every error. We even try to fix some of them. It's not strictly necessary, but if you'd like to give us additional information about this error, do so at our feedback site, meta.stackoverflow.com.

    Read the article

  • IIS7, different ports for websites but no portnumber in the browser

    - by Queensheep
    I have a Windows Server 2008 machine running IIS7 with four websites. In DNS I have four different URLs that point to the IP of the server. I configured each website's site bindings like this:

        website1: host name: url1, port: 80, IP address: the address of the server
        website2: host name: url2, port: 80, IP address: the address of the server

    The result is that, from a client, I can browse to all four URLs and reach the right websites, and everything is fine. Then I changed the port of the websites in IIS, so that website1 now uses port 8080, website2 uses port 8081, and so on. Now I have to include the port number in the browser (like url1:8080). Is there a way to configure the websites with different port numbers but not have to use the port numbers in the browser?
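
    Browsers only omit the port for 80 (HTTP) and 443 (HTTPS), so there is no way to keep the sites on 8080/8081 and get portless URLs without something listening on 80 and forwarding. The usual pattern is the first setup: all sites share port 80 and are distinguished by host header. A sketch with appcmd (the site names and host names are the placeholders from the post):

        rem Bind each site to port 80 with its own host header
        %windir%\system32\inetsrv\appcmd set site "website1" /bindings:http/*:80:url1
        %windir%\system32\inetsrv\appcmd set site "website2" /bindings:http/*:80:url2

    Note that /bindings: replaces the site's existing bindings, so list every binding the site should keep.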

    Read the article

  • Setup asp.net mvc application as subdomain website

    - by a_m0d
    I'm trying to set up a local application on a subdomain on our company server. There is already an installation of SharePoint running at http://companyweb/, but I would like my application to run at http://orders.companyweb/. I tried creating a new website, leaving the IP address the same as for http://companyweb, and just changing the host header value to orders.companyweb. However, no matter where I try to access the site from (different computers around the network, including the server itself), I keep getting 404 errors. I then tried setting up a simple index.html and serving that up as the highest priority; I still got 404 errors. This makes me think that I have set up the site itself wrong. What should I change to be able to access this application correctly from all the local computers?
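
    Two things worth ruling out first (assumptions, since the post doesn't say): that orders.companyweb actually resolves in your DNS (it needs its own A or CNAME record; the fact that companyweb resolves does not mean the subdomain does), and that the host header on the new site matches the name exactly. A quick client-side test is a temporary hosts-file entry (the IP is a placeholder for the web server's real address):

        # C:\Windows\System32\drivers\etc\hosts - temporary test entry
        192.0.2.10    orders.companyweb

    If the site loads with the hosts entry in place, the binding is fine and the missing piece is the DNS record.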

    Read the article

  • Is there any *good* HTML-mode for emacs?

    - by Carson Myers
    I love emacs, and I want to do my web-programming work in it, but I can't find a way to get it to edit HTML properly. I mean, it's seriously awful. It will do plain HTML fine, but not HTML with embedded PHP, JavaScript, etc. I tried html-helper-mode: I downloaded it, put it in /usr/local/share/emacs/site-lisp, and added it to my .emacs file:

        (autoload 'html-helper-mode "html-helper-mode" "Yay HTML" t)
        (setq auto-mode-alist (cons '("\\.html$" . html-helper-mode) auto-mode-alist))

    (copied and pasted from some site; I don't know elisp). It just doesn't highlight anything at all. I tried downloading a whole bunch of modes and using some other mode to string them together, to no avail. Emacs is so great in every other way, so why can't it do the simple task of editing web pages? I mean, it's a pretty standard thing for editors to do these days. So, does anyone know how to do this?
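
    For what it's worth, here is a sketch of the same setup with one extra line; it assumes the file really is named html-helper-mode.el and that the directory may not already be on load-path (site-lisp usually is, but not always for locally built Emacsen):

        ;; Make sure Emacs can find the file before autoloading it
        (add-to-list 'load-path "/usr/local/share/emacs/site-lisp")
        (autoload 'html-helper-mode "html-helper-mode" "Major mode for editing HTML" t)
        (add-to-list 'auto-mode-alist '("\\.html\\'" . html-helper-mode))

    If the mode loads but nothing is highlighted, also check that global-font-lock-mode is enabled; html-helper-mode predates font-lock being on by default.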

    Read the article

  • Planning for the Recovery

    - by john.orourke(at)oracle.com
    As we plan for 2011, there are many positive signs in the global economy, but also some lingering issues. Planning is no longer about extrapolating past performance and adjusting for growth. It is now about constantly testing the temperature of the water, formulating scenarios, assessing risk and assigning probabilities. So how does one plan for recovery and improve forecast accuracy in such a volatile environment? Here are some suggestions from a recent article I wrote, which was published in the December Financial Planning & Analysis (FP&A) newsletter from the AFP (Association of Financial Professionals):

    - Increase the frequency of forecasting
    - Get more line managers involved in the planning and forecasting process
    - Reconsider what's being measured, i.e. key financial and operational metrics
    - Incorporate risk and probability into forecasts
    - Reduce reliance on spreadsheets; leverage packaged EPM applications

    To learn more about these best practices, check out the FP&A section of the AFP website and register to receive the FP&A newsletter. AFP recently launched a new topic area focused on the FP&A function and items of interest to this group of finance professionals. In addition to the quarterly FP&A newsletter, AFP will be publishing articles, running webinars, and will have an FP&A track in their annual conference, which is in Boston next November. Brian Kalish, AFP's Finance Lead, is hoping this initiative creates a valuable networking and information-sharing resource for FP&A professionals.

    Here's a link to the FP&A page on the AFP web site: http://www.afponline.org/pub/res/topics/topics_fpa.html If you register on the site, you can access and subscribe to the FP&A newsletter and other resources.

    Best of luck in your planning for 2011 and beyond!

    Read the article

  • Am I correctly handling duplicate URLs for my homepage?

    - by Rob Goldstein
    I own a job search site named www.conservationjobboard.com and have a concern about how the domain is viewed by search engines. The issue is that when the site was first designed, the default page was left as default.php, but the homepage was actually JobBoard.php. To handle this, the default.php page performed a redirect to JobBoard.php whenever www.conservationjobboard.com/ was requested. The main problem arose because the redirect was a temporary redirect, causing search engines to index conservationjobboard.com/ and conservationjobboard.com/JobBoard.php as two separate pages. This has since been corrected via the .htaccess file, so that JobBoard.php is now the default file for the root directory, eliminating the need for the redirect. The problem is that search engines still show both URLs in search results (one including JobBoard.php and one that ends with /). Another potential problem is that some of my early backlinks point to conservationjobboard.com/JobBoard.php while the rest point to conservationjobboard.com.

    The two outstanding questions are as follows:

    1. Is my domain still being penalized by search engines like Google for having duplicate homepage URLs?
    2. Are all of the backlinks to my homepage now being counted together, or is the total number of backlinks being split between the two different URLs?

    If you think there are still issues with how we have this set up, I was wondering if you could give me advice on what we should do differently. Thanks.
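
    A common way to consolidate the two addresses (a sketch, assuming Apache with mod_rewrite, which the mention of .htaccess suggests) is a permanent redirect from the old filename back to the root, so both search engines and old backlinks converge on one URL:

        # .htaccess sketch: 301 explicit requests for JobBoard.php to the root.
        # The THE_REQUEST condition keeps the internal DirectoryIndex
        # subrequest for / from looping back into the redirect.
        RewriteEngine On
        RewriteCond %{THE_REQUEST} \s/JobBoard\.php [NC]
        RewriteRule ^JobBoard\.php$ / [R=301,L]

    With the 301 in place, engines treat the two URLs as one page, and link equity from the /JobBoard.php backlinks is passed to the root over time.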

    Read the article

  • Ignoring GET parameters in Varnish VCL

    - by JamesHarrison
    Okay: I've got a site set up which has some APIs we expose to developers, in the format:

        /api/item.xml?type_ids=34,35,37&region_ids=1000002,1000003&key=SOMERANDOMALPHANUM

    In this URI, type_ids is always set; region_ids and key are optional. The important thing to note is that the key variable does not affect the content of the response. It is used for internal tracking of requests, so we can identify people who make slow or otherwise unwanted requests. In Varnish, we have a VCL like this:

        if (req.http.host ~ "the-site-in-question.com") {
            if (req.url ~ "^/api/.+\.xml") {
                unset req.http.cookie;
            }
        }

    We just strip cookies out and let the backend handle the rest as far as response times are concerned (this is a hack-around, since Rails/authlogic sends session cookies with API responses). At present, though, distinct developers are basically hitting different caches, since &key=SOMEALPHANUM is considered part of the Varnish hash for storage. This is obviously not a great solution, and I'm trying to work out how to tell Varnish to ignore that part of the URI.
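
    One approach (a sketch in the same VCL dialect as above): rewrite req.url in vcl_recv to drop the key parameter before Varnish hashes the request, so every key shares one cached object:

        sub vcl_recv {
            if (req.url ~ "^/api/.+\.xml") {
                # Remove key=... and its delimiter, then tidy any trailing ? or &
                set req.url = regsub(req.url, "([?&])key=[^&]*&?", "\1");
                set req.url = regsub(req.url, "[?&]$", "");
            }
        }

    Since req.url is rewritten before the fetch, the backend never sees the key either; if it is still needed for request tracking, log it in vcl_recv (or track in front of Varnish) before stripping it.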

    Read the article

  • New META TAGS with positive effects for seo ranking in 2011 and beyond

    - by Sam
    Hi all, I'm trying to make an up-to-date chart of meta tags, for all of us, with their purposes, their use, and their good (or bad) effects on search engines/being found. Does anybody know new or promising meta tags? I will add yours to my list, so this chart is the result of live discussion and stays up to date. Also, it would be creative to invent your own useful meta tag, because we are the ones making the web, or aren't we?

    LEGEND
    P = PURPOSE: what does this meta tag do in 2011, if anything?
    N = NECESSARY: does every site really need it or not?
    G = GOOD: whether it will have a good effect on your site being found
    I = INVENTED meta tag; who knows, it may be accepted in a year!

    META "METANAME" = PURPOSE? - NECESSARY? - GOOD EFFECT?

    #### important
    meta "title" = P concise summary + teaser - N very - G extremely
    meta "description" = P description + teaser - N yes - G very
    meta "robots" = P if needed, to skip default DMOZ/Yahoo dir listing - N no - G ?

    #### new & promising! Thanks for input (John)
    meta "original-source" = P URL of whoever broke the news gets credit - N ? - G ?
    meta "syndication-source" = P URL for syndication of published news - N ? - G ?
    meta "canonical" = P ? - N ? - G ?

    #### seems obsolete
    meta "keywords" = P some keywords - N+G not for Google, but Yahoo likes them
    meta "language" = P overrule guesswork by defining the language - N no - G ?
    meta "page-topic" = P topic/theme - N ? - G ?
    meta "abstract" = P short summary - N ? - G ?
    meta "copyright" = ?

    #### invented by me
    meta "audience" = P filtered audience: "+seniors, +parents, -children, -youth"
    meta "mood" = P specifies textual style: "discussion, informative, commercial, sexual, fictional, scientific, romantic, therapeutic, technical"
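
    For reference, here is how the "important" rows of the chart look in an actual page head (a sketch; the content values are invented, and note that canonical is technically a link element rather than a meta tag):

        <head>
          <title>Concise summary of the page</title>
          <meta name="description" content="A short description that doubles as the search snippet teaser.">
          <!-- noodp/noydir skip the default DMOZ / Yahoo Directory listings -->
          <meta name="robots" content="noodp, noydir">
          <link rel="canonical" href="http://example.com/page-name">
        </head>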

    Read the article

  • Can prefixing a dash reduce the search engine rating?

    - by LeoMaheo
    Hi everyone! If I prefix a dash to GUIDs in the URLs on my web site, in this manner:

        example.com/some/folders/-35x2ne5r579n32/page-name

    will my SEO rating be affected?

    Background: on my site, people can look up pages by GUID and by path. For example, both example.com/forum/-3v32nirn32/eat-animals-without-friends and example.com/forum/eat-animals-without-friends could map to the same page. To indicate that 3v32nirn32 is a GUID and not a page name, I thought I could prefix a - and then my webapp would understand. But I wouldn't want my search engine rating to drop, and prefixing a dash in this manner seems weird, so perhaps Googlebot lowers my rating. Hence my question: do you know if my search engine rating might drop (today or in the future)?

    (I could also e.g. prefix id-, so the URL becomes example.com/forum/id-3v32nirn32, but then people cannot create pages that start with the word "id".) (I think I don't want URLs like this one: example.com/id/some-guid.)

    Kind regards, Magnus

    Read the article

  • Open Source Security packages for Rails

    - by Edwin
    I'm currently creating a complete web application using Rails 3 to familiarize myself with its inner workings and to gain a better appreciation of a working web application's moving parts. (Plus, since I'm still working on my degree, I hope that it will give me a better idea of what's BS in my education requirements and which weaknesses/skills I should focus on.) The example application I'm working on is an ecommerce site, and I've already configured the backend, routes, controllers, and so on. As part of the application, I'd like to integrate a second layer of security on top of the one Rails already provides for user authentication. However, I've been unable to find any on Google, with the exception of OAuth - which, from my understanding, is meant to secure API calls. While I could roll my own secure authentication system, I'm only in my second year of college and recognize that A) I know little about security, and B) there are developers that know much more about security that are working on open-source projects. What are some actively developed open-source security packages or frameworks that can be easily added to Rails? Pros and cons are not necessary, as I can do the research myself. P.S. I'm not sure whether I posted this in the right SE site; please migrate to SO or Security if it is more appropriate there.

    Read the article

  • Need help setting up mail DNS records

    - by Dave
    Hi, we are hosting our web site on HostMonster, but want our email to continue to be hosted at the old site. Our domain points to the HostMonster DNS servers, but I can't figure out the right configuration for the remote email servers. We have one MX entry:

        priority: 0, domain: ourdomain.com

    And then we have these DNS entries:

        name: mail.ourdomain.com    ttl: 14400  class: IN  type: A  record: old.host.ip.address
        name: mail1.ourdomain.com   ttl: 14400  class: IN  type: A  record: old.host.secondip.address

    Can someone tell me what I need to add or edit to get mail to route correctly to our old host? Thanks, - Dave
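
    For what it's worth, the usual shape of this setup (a zone-file style sketch; the mail host name is an assumption based on the A records above) is an MX record whose target is the name of the old mail server, with the A record supplying that server's IP. An MX whose target is the bare domain routes mail to wherever the domain's A record points, i.e. the new web host:

        ; sketch: route mail for ourdomain.com to the old host
        ourdomain.com.        IN  MX  10  mail.ourdomain.com.
        mail.ourdomain.com.   IN  A   old.host.ip.address    ; placeholder kept from the post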

    Read the article

  • Varnish + Plesk : vhost broken

    - by Raphaël
    I have an e-commerce site with 300,000 products and 20,000 categories. It is slow and currently in production, so I decided to install Varnish to speed it up. The trouble is that during installation I got a Guru Meditation. Since the site is in production, I am not allowed to leave this error up for more than a second, and I worry I've made an enormous blunder. I followed this tutorial: http://www.euperia.com/linux/setting-up-varnish-with-apache-tutorial and I'm sure I followed all of it without error. I suspect there may be some Plesk-specific configuration involved. Has anyone installed Varnish on an Ubuntu 11.04 server with Plesk 10? Does anyone have a better resource? I know "Guru Meditation" is very vague as an error, but maybe some of you have had this problem.

    Edit 24/11/2011: I continued to work on Varnish + Plesk, but it still does not work.

    1. I changed the port for Apache in Plesk:

        # mysql -uadmin -p`cat /etc/psa/.psa.shadow` -D psa -e'replace into misc (param, val) values ("http_port", 8008)'

    and rebuilt the server config:

        # /usr/local/psa/admin/bin/httpdmng --reconfigure-all

    2. I changed the Apache conf files (in case Plesk had not picked them all up), in /etc/apache2/ports.conf:

        NameVirtualHost *:8008
        Listen 8008

    I did the same with /etc/apache2/sites-enabled/000-default.

    3. I changed the port of my vhost (a single server) in /var/www/vhosts/MYDOMAIN.COM/conf/XXXXXXXXX.http.include, replacing port 80 with the one I want, then rebuilt the vhost conf, with and without www (see my issue on Server Fault: "Edit vhost port in plesk 10.3"):

        /usr/local/psa/admin/sbin/websrvmng --reconfigure-vhost --vhost-name=<domain_name>

    4. I installed Varnish by following this tutorial: http://www.euperia.com/linux/setting-up-varnish-with-apache-tutorial

    5. I restarted Apache 2 and Varnish:

        service apache2 restart
        service varnish restart

    When I go to my site, I get Apache's default page: "It works! This is the default web page for this server. The web server software is running but no content has been added, yet." Can somebody help me? This means that my vhost does not point to the right place. Why? What to do? How?
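
    Two things worth double-checking, sketched under the assumption that Varnish listens on 80 and Apache now listens on 8008. First, the backend definition in /etc/varnish/default.vcl must point at Apache's new port; second, Apache's name-based vhosts only match if the Host header survives the round trip, and the default page appearing usually means no vhost matched:

        # /etc/varnish/default.vcl - minimal backend sketch
        backend default {
            .host = "127.0.0.1";
            .port = "8008";
        }

    Also check /etc/default/varnish, where the Ubuntu package sets the listen port (the -a :80 option in DAEMON_OPTS).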

    Read the article

  • Preventing Users From Accessing wp-admin

    - by Gary Pendergast
    If you have a WordPress site that you allow people to sign up for, you often don't want them to be able to access wp-admin. It's not that there are any security issues; you just want to ensure that your users are accessing your site in a predictable manner. To block non-admin users from getting into wp-admin, you just need to add the following code to your functions.php, or somewhere similar:

        add_action( 'init', 'blockusers_init' );

        function blockusers_init() {
            if ( is_admin() && ! current_user_can( 'administrator' ) ) {
                wp_redirect( home_url() );
                exit;
            }
        }

    Ta-da! Now only administrator users can access wp-admin; everyone else will be redirected to the homepage.
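
    One caveat worth noting: is_admin() also returns true for requests to admin-ajax.php, so the snippet above can break front-end AJAX for logged-in non-admins. A variant (a sketch) that lets AJAX requests through:

        add_action( 'init', 'blockusers_init' );

        function blockusers_init() {
            // admin-ajax.php also reports is_admin() as true, so let AJAX pass
            if ( is_admin()
                 && ! current_user_can( 'administrator' )
                 && ! ( defined( 'DOING_AJAX' ) && DOING_AJAX ) ) {
                wp_redirect( home_url() );
                exit;
            }
        }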

    Read the article

  • Get to Know a Candidate (15 of 25): Jerry White&ndash;Socialist Equality Party

    - by Brian Lanham
    DISCLAIMER: This is not a post about "Romney" or "Obama". This is not a post about whom I am voting for. Information sourced from Wikipedia.

    White (born Jerome White) is an American politician and journalist, reporting for the World Socialist Web Site. White's presidential campaign has four core components:

    - International unity of the working class
    - Social equality
    - Opposition to imperialist militarism and the assault on democratic rights
    - Opposition to the political subordination of the working class to the Democrats and Republicans

    The White-Scherrer ticket is currently undergoing a review by the Wisconsin election committee concerning the party's ballot listing for the 2012 presidential elections. White has visited Canada, Germany, and Sri Lanka to campaign for socialism and an international working-class movement.

    The Socialist Equality Party (SEP) is a Trotskyist political party in the United States, one of several Socialist Equality Parties around the world affiliated to the International Committee of the Fourth International (ICFI). The ICFI publishes daily news articles, perspectives and commentaries on the World Socialist Web Site. The party held public conferences in 2009 and 2010. It led an inquiry into utility shutoffs in Detroit, Michigan earlier in 2010, after which it launched a Committee Against Utility Shutoffs. Recently it sent reporters to West Virginia to report on the Upper Big Branch Mine disaster and the way Massey Energy has treated its workers. It also sent reporters to the Gulf Coast to report on the Deepwater Horizon oil spill. In addition, it has participated in elections with the aim of opposing the American occupation of Iraq and building a mass socialist party with an international perspective. Despite having been active for over a decade, the Socialist Equality Party held its founding congress in 2008, where it adopted a statement of principles and a historical document.

    White has ballot access in: CO, LA, WI. Learn more about Jerry White and the Socialist Equality Party on Wikipedia.

    Read the article

  • Do or can robots cause considerable performance issues?

    - by Anicho
    So the question in the title is exactly what I am trying to find out. My case is: at work we are in a discussion with team members who seem to think bots will cause us performance problems on our services website.

    Our setup: let's say I have the site www.mysite.co.uk; this is a shop window for our online services, which sit on www.mysiteonline.co.uk. When people search Google for mysite, they see mysiteonline.co.uk as well as mysite.co.uk.

    Cases against stopping bots from crawling:

    - We don't store GBs of data publicly available on the web
    - Most friendly bots, if they were going to cause issues, would have done so already
    - In our instance the bots can't crawl the site, because it requires a username and password
    - Stopping bots with robots.txt causes an issue with SEO (ref. 1)
    - If it were a malicious bot, it would ignore robots.txt or meta tags anyway

    Ref 1: if we were to block robots from crawling mysiteonline.co.uk, this would affect SEO rankings and make it inconvenient for users who actively search for mysite to find mysiteonline, which we can prove is the case for a good portion of our users.
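
    For completeness: if crawl load ever did become measurable, there is a middle ground between allowing everything and blocking. A robots.txt sketch (note that Crawl-delay is non-standard; Bing and Yandex honor it, while Google ignores it and uses its own Search Console crawl-rate setting instead):

        # throttle well-behaved bots instead of blocking them
        User-agent: *
        Crawl-delay: 10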

    Read the article

  • Recommendation using Client side performance monitoring (boomerang/jiffy/episodes)

    - by Yasei No Umi
    There are a few client-side JavaScript libraries that measure web-site performance on the client side:

    - Jiffy (http://code.google.com/p/jiffy-web/)
    - Episodes (http://stevesouders.com/episodes/) by Steve Souders
    - Boomerang (http://yahoo.github.com/boomerang/doc/) by Yahoo!

    Have you used any of them, or a similar tool? What did you use for the server side? For reporting? Is this a recommended approach? If not, how should I monitor my web-site's performance from the end user's point of view?
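
    To give a feel for the approach, Boomerang's basic wiring looks roughly like this (a sketch; the beacon URL is a placeholder, and the server side of it, the collector that records the timings, is the part you build or choose yourself):

        <script src="boomerang.js"></script>
        <script>
          // Timing beacons are sent as HTTP requests to this endpoint
          BOOMR.init({
            beacon_url: "http://example.com/beacon"
          });
        </script>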

    Read the article

  • Plesk file permissions - Apache/PHP conflicting with user accounts.

    - by hfidgen
    Hiya, I'm building a Drupal site which performs various automatic disk operations as the Apache user (id=40). The problem is that the site was set up on a subdomain belonging to user ID 10001 (i.e. my main FTP account), so the filesystem belongs to that user ID. So I keep getting errors like this:

        warning: move_uploaded_file() [function.move-uploaded-file]: SAFE MODE Restriction in effect. The script whose uid is 10001 is not allowed to access /var/www/vhosts/domain.com/httpdocs/sites/default/files/images/user owned by uid 48 in /var/www/vhosts/domain.com/httpdocs/includes/file.inc on line 579.

    I've tried changing the Apache group in httpd.conf to apache:psacln, psacln being the default group for all web users, but that hasn't helped. The situation now is:

        ..../files/images      = 777, chown ftplogin:psacln
        ..../files/images/user = 775, chown apache:psacln
        ..../files/tmp         = 777, chown ftplogin:psacln

    So apparently uid 40 and uid 10001 both have permission to write to any of the three directories involved, but still can't. Am I missing something here? Can anyone help? Thanks!
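
    Note that the message is PHP's safe_mode ownership check rather than a filesystem permission problem: under safe_mode, a script owned by uid 10001 may not touch files owned by a different uid, regardless of mode bits. Two sketches of a way out (the php_admin_flag placement is an assumption; where per-domain PHP settings live varies by Plesk version):

        # Option 1: disable safe_mode for this vhost (it is deprecated as of PHP 5.3)
        #   in the domain's vhost.conf / per-domain PHP settings:
        #   php_admin_flag safe_mode off

        # Option 2: hand the whole files tree to one owner so the uids match
        chown -R ftplogin:psacln /var/www/vhosts/domain.com/httpdocs/sites/default/files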

    Read the article

  • What's the best approach to Facebook integration?

    - by Jay Stevens
    I have a new site/app going live next week (or somewhere close). I know there will be a relatively small (15,000?) but very dedicated group of people on Facebook who will be very likely to be interested in the site, so I know I need Facebook integration of some kind. I won't be doing Facebook logins or pulling/posting to profiles yet, but I plan to...

    The question:

    1. Do I just do a Facebook "Page" for now? This is faster/easier to set up and seems a little less buggy, and I could then migrate to a Facebook App later. Or:
    2. Do I create a "Facebook App" (with the API key/ID/secret, etc.) now, even if I'm doing nothing but using the "Like" button? This means I don't have any migration later, and I can use the JavaScript API to log "Like" button clicks to Google Analytics, etc.

    Thoughts? Experiences? Is there a migration process to move your old Page users to your new "App"? What are the advantages and disadvantages of each?

    Read the article

  • Basics of Hosting [closed]

    - by Bala
    Assume we know nothing about web hosting but need to get a site online. What questions do we need to ask potential web hosting companies? What are the pitfalls and places where things can go terribly wrong? Are there any general good or bad things to be on the lookout for? Site could be anything from basic HTML up to e-commerce. We're looking for general thoughts that could apply to any web hosting. Thanks!

    Read the article

  • TFS and Sharepoint integration

    - by pho3nix
    I'm using SharePoint 2007 and TFS 2010. I installed everything, including the reports/SharePoint integration, but when I try to create a team project collection it always returns this warning:

        Server was unable to process request. --- TF250029: No user was found within the context of a Web site. Verify that the site does not allow anonymous access.

    And in the SharePoint Web Applications window of the TFS administration console, when I check my SharePoint portal, /sites returns an error saying it is blocked by a firewall or that the server extensions are not installed, but everything on my side looks fine. Can anyone help me, please?

    Read the article
