Search Results

Search found 28303 results on 1133 pages for 'multi site'.

Page 212 of 1133

  • Can an IP address transfer from one person to another after they disconnect from their ISP, or in some other way?

    - by learner
    I have been checking a website that sells a health-related product and trying to find out whether it is a scam site. The site is something.blogspot.in (not something.blogspot.com, which happens to be a different site altogether). So is it an Indian site? It has a CBox chat box where the owner communicates with customers (or potential ones). The owner shows that his product has worked for people by providing links to a forum (created by him at network54.com) where people have posted positively. One doesn't have to be registered to post there, but the IP address of the poster is shown along with the post. According to the owner, the IP address is the basis of authenticity. I found that many people had different IP addresses on their different posts. The owner has declared the nationalities of the people who posted, but when I traced those IP addresses with this site, I found that the nationalities he provided were wrong. Is it possible that when a person disconnects from an ISP, another person in another country gets the old IP address?
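
    A quick way to sanity-check where an address points is a reverse-DNS lookup, since the resulting hostname usually names the ISP and often hints at the country. A minimal sketch using only Python's standard library (the address below is a hypothetical example, not one from the forum):

        import socket

        ip = '203.0.113.45'  # hypothetical poster IP
        try:
            hostname, _aliases, _addresses = socket.gethostbyaddr(ip)
            print(ip, '->', hostname)
        except socket.herror:
            print(ip, 'has no reverse DNS entry')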

    Read the article

  • Disaster Recovery Example

    Previously, I used to work for a small internet company that sells dental plans online. Our primary focus concerning disaster prevention and recovery was our corporate website and private intranet site. We had a multiphase disaster recovery plan that included data redundancy, load balancing, and off-site monitoring. Data redundancy is a key aspect of our disaster recovery plan. The first phase of this is to replicate our data to multiple database servers and schedule daily backups of the databases, which are stored off site. The next phase is file replication of data amongst our web servers, which are also backed up daily by our colocation provider. In addition to the files located on the servers, files are stored locally on development machines and backed up again using version control software. Load balancing is another key aspect of our disaster recovery plan. Load balancing offers many benefits for our system: better performance, load distribution, and increased availability. With our servers behind a load balancer, our system can accept multiple requests simultaneously because the load is split between multiple servers. And if one server is slow or experiencing a failure, traffic is diverted amongst the other servers connected to the load balancer, giving the failing server time to get back online. The final key to our disaster recovery plan is off-site monitoring that notifies all IT staff of any outages or errors the monitor encounters on the main website. Messages are sent by email, voicemail, and SMS. According to Disasterrecovery.org, disaster recovery planning is how companies successfully manage crises with minimal cost and effort and maximum speed, compared to others that are forced to make decisions out of desperation when disasters occur. In addition, SunGard stated in 2009 that the first step in disaster recovery planning is to analyze company risks and factor in fixed costs for things like hardware, software, staffing and utilities, as well as indirect costs such as floor space, power protection, physical and information security, and management. Availability requirements also need to be determined per application and system, as well as the strategies for recovery.
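
    The off-site monitor described above is easy to approximate. A minimal sketch in Python (not the company's actual tooling), assuming a hypothetical site URL, SMTP relay, and alert addresses; voicemail and SMS delivery would hang off the same check:

        import smtplib
        import urllib.request
        from email.message import EmailMessage

        SITE_URL = 'https://www.example.com/'   # hypothetical site to watch
        SMTP_HOST = 'smtp.example.com'          # hypothetical mail relay
        ALERT_FROM = 'monitor@example.com'
        ALERT_TO = 'itstaff@example.com'

        def check_site():
            """Return a problem description, or None if the site looks healthy."""
            try:
                with urllib.request.urlopen(SITE_URL, timeout=10) as resp:
                    if resp.status == 200:
                        return None
                    return 'HTTP %d from %s' % (resp.status, SITE_URL)
            except Exception as exc:
                return '%s unreachable: %s' % (SITE_URL, exc)

        def send_alert(problem):
            msg = EmailMessage()
            msg['Subject'] = 'Site outage detected'
            msg['From'] = ALERT_FROM
            msg['To'] = ALERT_TO
            msg.set_content(problem)
            with smtplib.SMTP(SMTP_HOST) as smtp:
                smtp.send_message(msg)

        if __name__ == '__main__':
            problem = check_site()
            if problem:
                send_alert(problem)

    Run from cron on a machine outside the primary data center, so the monitor itself survives the outage it is reporting.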

    Read the article

  • SEO with duplicate content

    - by user16831
    I have a nature photography site with multiple types of photo galleries. Each photo and associated caption on my site appears in several galleries. For instance, a photo of a goldfinch that was taken on a trip to New Mexico in 2008 will appear in the "goldfinch.php" gallery, in the "finches.php" gallery, and in the "New_Mexico_2008.php" gallery. This duplication is useful for my site visitors - User A may want to see goldfinch photos, whereas User B wants to see photos from New Mexico - but I am concerned about the SEO implications. The typical suggestions for dealing with duplicate content, such as 301 redirects and canonical tags, probably won't work in this case, because the page content is substantially different (ranging from ~1% to ~90% duplication, depending on the specific example chosen). The obvious solution to me would be to edit robots.txt to only allow search engines to crawl one type of gallery - for instance, if they crawled only the galleries organized by species (e.g. goldfinch.php), all the photos on my site would be found exactly once. However, the Google content guidelines recommend against blocking crawler access to duplicate information. Should I go ahead and use robots.txt anyway? Or is there a better solution?
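
    If you do go the robots.txt route, Python's standard library can verify what a proposed rule set would actually leave crawlable before you deploy it. A minimal sketch, with hypothetical gallery paths standing in for the real ones:

        import urllib.robotparser

        PROPOSED_RULES = [
            'User-agent: *',
            'Disallow: /finches.php',
            'Disallow: /New_Mexico_2008.php',
        ]

        parser = urllib.robotparser.RobotFileParser()
        parser.parse(PROPOSED_RULES)

        for url in ('http://example.com/goldfinch.php',
                    'http://example.com/finches.php',
                    'http://example.com/New_Mexico_2008.php'):
            print(url, 'crawlable:', parser.can_fetch('*', url))

    This only checks that the rules do what you intend; whether blocking is the right call at all is the open question above.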

    Read the article

  • Using web proxies - is it safe to enter passwords?

    - by bergin
    Hi. I wanted to check something on a local site and see how the outside world sees it. However, when using a web proxy I'm not sure that the proxy won't record my credentials as I enter them and give the proxy owner access to my site. Is there another way to see my own site as though I were on the other side?

    Read the article

  • Google Analytics unexplained spike

    - by Dianne
    My client's Google Analytics has shown a spike every day since May 6th (from 0 to 100). This is in a city that he is not optimized for and does very little business in. The hits are coming in direct to the website. My client is concerned that it has something to do with competition using his site as a price-shopping device. I can't view the IP to see where the visits are coming from, and his site is not built in PHP, so the usual workaround doesn't apply here. Any thoughts? Could it be a "referring site" situation, and if so, is there a way for me to find out what the referring site is?

    Read the article

  • Loading the WordPress admin page blocks all further requests

    - by Auxiliary
    I've installed a WordPress site on a host (actually I moved it from localhost). The site loads and works perfectly; however, if I log in as Admin, the admin page doesn't completely load and all further requests to the server are blocked for about 10 or 15 minutes. I can't even ping the site. Where could this problem come from? Is it firewall-related? Any help or idea is greatly appreciated.

    Read the article

  • 'Buy the app' landing page implementations: redirect or javascript popup?

    - by benwad
    My site (using Django) has an app that I'm trying to push - I currently have a piece of middleware that redirects the user to a page advertising the app if they're accessing the site on an iPhone, and then sets a cookie so that the user isn't bugged by the message every time they visit the site. This works fine; however, checking the page with the mobile Googlebot checker shows that the Googlebot gets stuck in the redirect (since it doesn't store cookies) and therefore won't index the proper content. So I'm trying to think of an alternative implementation that won't hurt the site's Google ranking and won't have any other adverse effects. I've considered a couple of options: Redirect (the current solution), but don't redirect if the user agent matches the Googlebot's UA string. This would be ideal; however, I'm not sure if Google likes its bot being treated differently from other users, and I'm afraid the site's ranking may be somehow penalised if I go ahead with this. Use a JavaScript popup instead of a redirect. This would make sure the Googlebot finds the content it needs; however, I envision this approach causing compatibility issues with the myriad mobile devices/browsers out there, and it may affect the page load time. How valid are these options? And is there a better option out there for implementing this feature? I've tried researching this topic but surprisingly can't find any reputable-looking blog posts that explore it.
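
    For reference, the first option is only a few lines of middleware. A minimal sketch using old-style Django middleware, with a hypothetical landing-page URL and cookie name; note that serving crawlers different behaviour than users is exactly what Google's cloaking guidelines frown on, so treat this as an illustration rather than a recommendation:

        from django.http import HttpResponseRedirect

        LANDING_URL = '/get-the-app/'   # hypothetical landing page
        COOKIE_NAME = 'seen_app_ad'     # hypothetical cookie name
        MOBILE_HINTS = ('iphone', 'ipod')
        BOT_HINTS = ('googlebot',)

        class AppLandingMiddleware(object):
            def process_request(self, request):
                ua = request.META.get('HTTP_USER_AGENT', '').lower()
                if any(hint in ua for hint in BOT_HINTS):
                    return None  # let crawlers through to the real content
                if request.COOKIES.get(COOKIE_NAME) or request.path == LANDING_URL:
                    return None  # user has already seen the ad, or is on the landing page
                if any(hint in ua for hint in MOBILE_HINTS):
                    response = HttpResponseRedirect(LANDING_URL)
                    response.set_cookie(COOKIE_NAME, '1', max_age=60 * 60 * 24 * 30)
                    return response
                return None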

    Read the article

  • TFS 2010 Build Definition and "Access to the path is denied" error?

    - by Daniel DiVita
    I am new to TFS with regards to build definitions. I have a build folder set up where I have set the permissions so Everyone has full control. Here is the exact error I am getting: E:\Builds\PIMSite\PIM.Site\PIM_Site.metaproj: Unable to copy file "C:\Builds\1\PIM System\PIM Site Build\Binaries\HtmlAgilityPack.xml" to "..\PIM.Site\Bin\HtmlAgilityPack.xml". Access to the path '..\PIM.Site\Bin\HtmlAgilityPack.xml' is denied. I have tried everything. I have removed everything from that folder and can delete it just fine, so it is not being used by another process. Any thoughts?

    Read the article

  • <authentication mode="Windows"/>

    - by kareemsaad
    I get this error when I browse a new web site that is a sub-site of a sub-site: http://sharp.elarabygroup.com/ha/deault.aspx (ha is my new web site). Configuration Error. Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately. Parser Error Message: It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level. This error can be caused by a virtual directory not being configured as an application in IIS. Source Error: lines 61-65 of the web.config, which hold the standard comments describing the <authentication> section ("...ASP.NET to identify an incoming user") and the section that "enables configuration of what to do if/when an unhandled error occurs". I note that other web sites that are subs of subs don't have their own web.config. Is this related to web.config, given that all the subs and the domain share one web.config?

    Read the article

  • Why are my backlinks not showing on Google for this ASP.NET website, despite all I've done?

    - by Jason Weber
    I recently implemented many SEO techniques for a company on their ASP.NET website; in 6 months, we jumped from a PR1 to a PR3. But I'm having issues with Google backlinking. Here are some of the things I've done: Not only did I set up their own Google+ page 6 months ago, I update it pretty much daily with links, pictures, etc., and I blog about it on my own personal Google+ page and post links there too. They have their own Twitter, Facebook, and YouTube, and all are updated almost daily. I listed the site in as many quality, relevant directories as possible 6 months ago and avoided link farms. The site is solid SEO-wise: key-phrase-rich URLs, schema.org and rich snippets, no duplicate content, www/non-www 301s, trailing slashes, etc. - all taken care of. Probably a ton of other things, but basically, the site is all set, SEO-wise. Here's what's confounding: when I do a link:www.example.com in Bing/Yahoo, it shows many backlinks. When I do a link:www.example.com in Google, it shows 0 links. And when I use a site ranker like Web Site Rank Tool, it shows 0 backlinks from Google. Any suggestions would be appreciated!

    Read the article

  • OpenWeb(String) method

    - by ybbest
    I guess this is a SharePoint beginner problem; however, it took me a while to figure out what the problem was, and I will blog it to help me remember. Basically I wrote the following code to grab some list items from my SharePoint subsite http://win-oirj50igics/RestAPI; however, I got an error stating: "<nativehr>0x80070002</nativehr><nativestack></nativestack>There is no Web named / http://win-oirj50igics/RestAPI". The problem is that the OpenWeb(String) method returns the web site located at the specified server-relative or site-relative URL. It takes a relative URL, so after I changed http://win-oirj50igics/RestAPI to RestAPI, everything works fine.

    using (SPSite site = new SPSite("http://win-oirj50igics/"))
    {
        // OpenWeb expects a server-relative or site-relative URL, not an absolute one
        SPWeb web = site.OpenWeb("RestAPI");
        SPQuery query = new SPQuery();
        query.Query = camlDocument.InnerXml;
        SPListItemCollection items = web.Lists["Songs"].GetItems(query);
        IEnumerable<Song> sortedItems = from item in items.OfType<SPListItem>()
                                        orderby item.Title
                                        select new Song { SongName = item.Title, SongID = item.ID };
        songs.AddRange(sortedItems);
    }

    Read the article

  • Sneak Preview - New CodePlex UI

    We have been busy the last several months working to improve the overall experience for the CodePlex community. We have been working through some of the top requested items, such as our big announcement last week enabling Git. Something that is not explicitly on the feature request list is the set of requests to update the web site's look and user experience. As Brian Harry mentioned, the Future of CodePlex is Bright, so it is time to start brightening up the place. Goals: As with any sizeable change, you need to decide the scope of changes you want to tackle. We decided that we would optimize for incremental improvements versus taking months to get a completely new experience released. Our goals with this user experience work are to refresh the look and feel of the site, introduce new visual elements, and set up the site for future structural changes. So this is not the end, just the beginning. Early views: I want to set a few expectations first - these screenshots are not final, and we are still working through the content and final element placement. Feedback is always welcome; just keep that in mind as you review the images. New CodePlex home: the navigation changed a good bit on the home page and we have moved the search to a more consistent location across the site. User profile and user home page: the goal was to make it easier to find and take action on common tasks such as creating projects. Screenshots of the project home and issue tracker round out the preview. This should give you a taste of where we are going with the new user experience. As always we love the feedback: either comment below, find us on Twitter @codeplex or @mgroves84, or create or vote up suggestions.

    Read the article

  • Is there a standard method for assigning nameservers to servers in a Windows domain?

    - by HopelessN00b
    As the title asks, I'm wondering if there's any standard or "best practice" for how to actually assign nameservers (DNS) and manage the nameserver configuration for client servers on a Windows domain. I'm talking about the DNS server entries set on each server's network adapter, in case the language of the question is not clear enough. This is for a large, multi-site environment, where ideally/hopefully all servers point at their site's Domain Controller as the primary DNS server, and a DNS server at a different site as the secondary DNS server. For simplicity's sake, we can say that the secondary server would be the Domain Controller at the home site for everyone, and there are no tertiary DNS servers (even though that's not actually the case). Try as I might, I can't seem to find a GPO setting for this (at least on FL 2003 R2, the Computer Configuration - Administrative Templates - Network - DNS Client - DNS Servers GPO is supported on Windows XP Professional only), and I find it rather hard to believe that the "best"/"standard" solution would therefore be either scripting up something to apply the DNS settings per site, or using a DHCP server to push those configurations out to the other servers via the DHCP scope options. So, is there a standard way of managing this configuration? (Hopefully not "a script" or "a DHCP scope option".)

    Read the article

  • Could crosslinking using very general anchor texts be a reason for a drop in rankings?

    - by webmasters
    I have crosslinked 20 sites and thought I had been penalized for this; I asked this question and some experienced members told me that the crosslinking may not necessarily be the reason. The sites are on the same host, on different C-class IPs, and every site is linked to every other. Each site targets long-tail keywords: Site 1 - BMW Used Cars - and my area; Site 2 - WW Used Cars - and my area; and so on. When I crosslinked them (in the sidebar), I did it for the users; instead of repeating the terms "used cars" and my location over and over (since my users are targeted), I just crosslinked them using the brand: BMW, WW. Targeting locally, my niches are not overly competitive, so I did not need too many external links to rank in various positions on the first page. I'm thinking that when I chose to link using only the brand, Google might have thought I wanted to actually rank for BMW and WW, hence the drop in my targeted local traffic. Could this be? I have now no-followed the links and I am noticing a slight recovery, but if it's not an interlinking penalty it would be a shame not to benefit from my links.

    Read the article

  • Setup IIS 7.5 with multiple website bindings and SSL?

    - by JK01
    On IIS 7.5 I am trying to achieve this with two websites. The Default Web Site is bound to: a blank host header on port 80 (http), a blank host header on port 443 (https), go.example.com, www71.example.com, and the IP address of go.example.com. The second web site, "Beta", is bound to: beta.example.com, plus a blank host header on port 443 (https) - using blank only because it doesn't seem to be possible to bind https to a named host header. Both need to work with SSL. But I have these problems: when I type in beta.example.com, I see the go.example.com site instead; and I cannot seem to add the SSL binding to both websites at once (I have a single *.example.com wildcard certificate). The beta site will not even start if I add the https binding to it. What is the correct way to set this up?

    Read the article

  • Why does my webpage look different when I connect using different routers? Do routers cache files?

    - by Ayyash
    Here is the case: I am working on a site from the office and from home. I recently updated the stylesheets and logged into the live site from the office (using the same laptop I use all the time), and everything looks okay. I come home, use my home internet connection to connect to the site using the SAME laptop, and the styles are not updated! The thing is, this happens in ALL browsers, after emptying the cache many times, even after one month of work, and even on a browser that has never opened the site before (as if my router has a cache of its own). Another thing: only one particular styles.css file seems to be stuck. Extra info: my home wireless router uses the same IP as the one defined in the office, the usual 192.168.0.1.
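
    One way to see which copy of the file is actually being served over a given connection is to request the stylesheet directly, once as-is and once with a cache-busting query string, and compare the validators that come back. A minimal sketch, with a hypothetical URL standing in for the real one:

        import urllib.request

        CSS_URL = 'http://www.example.com/styles.css'  # hypothetical stylesheet URL

        for url in (CSS_URL, CSS_URL + '?nocache=1'):
            req = urllib.request.Request(url, headers={'Cache-Control': 'no-cache'})
            with urllib.request.urlopen(req, timeout=10) as resp:
                body = resp.read()
                print(url)
                print('  Last-Modified:', resp.headers.get('Last-Modified'))
                print('  ETag:', resp.headers.get('ETag'))
                print('  Length:', len(body))

    If the two responses differ, something between the laptop and the origin (an ISP proxy, for example) is holding onto the old copy; versioning the filename or adding a query string to the <link> tag sidesteps it.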

    Read the article

  • wildcard host name bindings for multiple subdomains in multiple sites on IIS7 with a single IP address

    - by orca
    Situation: I have a single Windows 2008 server with a single public IP address. I have multiple domains with wildcard A records pointing to that single IP address. I need each domain to be hosted by a different web site (i.e. www.domain1.com by the site domain1site). I need domain1.com to act like www.domain1.com. I need each site to be able to have multiple subdomains (i.e. www.domain1.com, abc.domain1.com, xyz.domain1.com). Not strictly relevant yet, but here it goes: I plan to handle each subdomain with a different application hosted in the same site (i.e. application /xyz in domain1site). However, I found out that IIS7 does not support creating web sites with wildcard host name bindings, and setting a binding without any subdomain (i.e. domain1.com) does not work, even for www.domain1.com. Is there a simple solution? Does any IIS extension, such as Application Request Routing, provide such a capability?

    Read the article

  • Domain only responding to certain locations?

    - by CuriosityHosting
    I have a client who's been having problems with his site. The server doesn't seem to want to load his site in certain countries, though other sites are fine. This site [link removed] only seems to load in the US and Canada; in Europe, the UK, Asia, etc., the site seems to be blocked (it's been like this for a week now). I've looked over the server and it seems fine. Other sites work fine, and the NS records are set up properly, pointing to my main server, at http://puu.sh/MIGF. Any ideas?

    Read the article

  • Getting rank for keywords that I don't want to appear on my website [duplicate]

    - by Rober
    This question already has an answer here: "Which keyword should I use: colors or colours or a combination of both?" One of my products has two names. One of them is what I consider correct and thus it is what I want to appear on my website. The other name is incorrect to me, so I would like to avoid it. But I know that many people will search for my product using the "bad" name. How could I get the "bad" name indexed for my site on search engines even if nobody can read it there? Of course, I want to do it "legally", so that no engine will ban my site for cloaking, black-hat SEO, etc. EDIT: Having the "bad" name in my backlinks is not an option. For example, I would perceive user reviews connecting my site to that word as a negative point. Maybe having my site as a search result for that word could be negative as well, but I think it is worth it.

    Read the article

  • Google is re-indexing pages after redirecting URLs from HTTP to HTTPS incorrectly

    - by SLIM
    I upgraded my site so that all pages have gone from using HTTP to HTTPS. I didn't consider that Google treats HTTPS pages differently from HTTP. I recreated my sitemap so that all links now reflect the new HTTPS URLs and let it be for a few days. (Whoops!) Google is now re-indexing all the HTTPS pages. I have about 19k pages on the site, and Google has already indexed about 8k of the new HTTPS pages. The problem is that Google sees all of these as brand new pages, when many of them have a long HTTP history. Of course most of you will recognize the problem: I didn't set up a 301 from the old HTTP URLs to the new HTTPS URLs. Is it too late to do this? Should I switch my sitemap back to HTTP URLs and then 301 redirect to the new HTTPS URLs? Or should I leave the sitemap as is and set up 301 redirects anyway? I'm not even sure if Google is trying to reach the HTTP site anymore. Currently the site is doing 303 redirects (from HTTP to HTTPS), although I haven't figured out why yet.
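
    Whatever is answering on port 80 right now is worth checking directly, since the 303s mentioned above are exactly what needs to become 301s. A minimal sketch that asks the old HTTP URLs what they return, using only Python's standard library (the host and paths are hypothetical):

        import http.client

        HOST = 'www.example.com'                       # hypothetical host
        PATHS = ['/', '/about/', '/products/widget/']  # hypothetical sample pages

        for path in PATHS:
            conn = http.client.HTTPConnection(HOST, timeout=10)
            conn.request('HEAD', path)
            resp = conn.getresponse()
            # a healthy migration shows 301 plus a Location pointing at the HTTPS URL
            print(path, resp.status, resp.getheader('Location'))
            conn.close()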

    Read the article

  • Subdomain is preventing my search results from rising as expected in page rank

    - by culov
    My problem is that I have a site which requires a dedicated page for every city I choose to support. Early on, I decided to use subdomains rather than a directory after my domain (i.e. I used la.truxmap.com rather than truxmap.com/la). I realize now that this was a major mistake, because Google seems to treat la.truxmap.com and ny.truxmap.com as completely different sites. So, for instance, if I search "la food truck map" my site will be near the top; however, if I search "nyc food truck map" I'm nowhere in sight, because ny.truxmap.com wouldn't be very high in the page rank by itself, and it doesn't get the boost that it ought to be getting from the better-known la.truxmap.com. So a mistake I made a year ago is now haunting my page rank. I'd like to know the most painless way of resolving my dilemma. I have received so much press at la.truxmap.com that I can't just kill the site, but could I redirect all requests at la.truxmap.com to truxmap.com/la, and do the same for all supported cities, without trashing the current, satisfactory page rank results I'm getting from la.truxmap.com? EDIT: I left out some critical information. I am using Google Apps to manage my domain (that is, to add the subdomains) and Google App Engine to host my site. Thus, Google Apps provides a simple mechanism to mask truxmap.appspot.com (the App Engine domain) as la.truxmap.com, but I don't see how I can mask it as truxmap.com/la. If I can get this done, then I can just 301 redirect la.truxmap.com to truxmap.com/la as suggested below. Thanks so much!
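
    Once the main app serves truxmap.com/la directly, the old subdomains only need a catch-all handler that issues the permanent redirect. A minimal sketch for a Python App Engine app using webapp2, with a hypothetical host-to-city mapping; the real routing would live in whatever app already answers on the subdomains:

        import webapp2

        CITY_BY_HOST = {                    # hypothetical mapping
            'la.truxmap.com': 'la',
            'ny.truxmap.com': 'ny',
        }

        class SubdomainRedirect(webapp2.RequestHandler):
            def get(self, path=''):
                city = CITY_BY_HOST.get(self.request.host)
                if city is None:
                    self.abort(404)
                # 301 so the accumulated page rank follows the content to its new home
                self.redirect('http://truxmap.com/%s/%s' % (city, path), permanent=True)

        app = webapp2.WSGIApplication([
            webapp2.Route(r'/<path:.*>', SubdomainRedirect),
        ])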

    Read the article

  • Apache ProxyPass/ProxyPassReverse to IIS

    - by Dana
    We have an ASP.NET web application which is mapped to a folder on an Apache-hosted PHP site using ProxyPass/ProxyPassReverse. A couple of problems are being encountered: cookies are being lost, which breaks the site navigation; this can be overcome by setting the ASP.NET app as cookieless. Forms authentication is used on the ASP.NET site, and this is also broken with the ProxyPass in place; I suspect this is cookie-related as well. The ASP.NET site works OK when run from a domain/IP address directly. Use of a separate domain / sub-domain is not an option due to client requirements.

    Read the article

  • Under what circumstances might an IIS6 website be automatically deleted?

    - by E. Anderson
    Late last week my colleagues did some hardware maintenance on one of our VMware ESXi servers. One of the guests is a Windows Server 2003 Web Edition system that runs our low-traffic web sites. We discovered this morning that one of those websites was no longer working, with what appeared to be an SSL error. After logging in, I found that the web site in question had been deleted from IIS! Is it possible for this to happen without a user actually going in and deleting that single web site? All of the other sites were fine. The files for the site in question had not been touched. I re-created the web site, assigned the SSL cert, and everything was working again. When I logged in, I did see the 'Unexpected Shutdown' dialog.

    Read the article

  • PHP Page Stopped outputting content After Running "yum install php-devel" Command

    - by stwhite
    This error is bizarre, but after running the "yum install php-devel" command (after a long day of trying to install Facedetect and OpenCV for face detection) my site stopped functioning. The site uses MySQL and PHP. When you hit the URL, the page executes the MySQL and the PHP, but it appears to randomly stop outputting the content of the page. None of the code was changed, and the site was working flawlessly prior to running the mentioned SSH command. I do use output buffering in the site, but removing the calls to "ob_flush", "ob_end_flush" and "ob_start" didn't appear to help; I'm still having issues with the site. Any ideas what this could be? Here is the output from the terminal:

    [myserver ~]# cd Facedetect-4b1dfe1
    [myserver Facedetect-4b1dfe1]# phpize
    Configuring for:
    PHP Api Version: 20090626
    Zend Module Api No: 20090626
    Zend Extension Api No: 220090626
    [myserver Facedetect-4b1dfe1]# configure
    bash: configure: command not found
    [myserver Facedetect-4b1dfe1]# phpize && configure && make && make install
    Configuring for:
    PHP Api Version: 20090626
    Zend Module Api No: 20090626
    Zend Extension Api No: 220090626
    bash: configure: command not found
    bash: Read: command not found
    [myserver Facedetect-4b1dfe1]# make
    make: *** No targets specified and no makefile found. Stop.
    [myserver Facedetect-4b1dfe1]# yum install php5-devel

    Read the article

  • How can I change a specific website's style?

    - by Darthfett
    I have a specific website I use often (specifically, http://www.pygame.org/), which has an awful color scheme. I would like to change the color scheme of the site, but I haven't been able to find a good tool for the job. Some basic requirements: it should not be universal to all websites, or affect other websites; it should be semi-automatic, so I don't have to re-define the theme for each page of the site; I want to continue to access the site online (I don't want a local copy of the entire site); and it should not be OS-specific (browser-specific is okay; I am currently using Firefox, but I am also happy with Chrome). There may be some limitations on what can be done automatically, as the CSS seems to be embedded in the HTML (and some is also in the HTML tags). I would like to remove as much of the green as possible. Is there an existing extension/add-on that does this?

    Read the article
