Search Results

Search found 28303 results on 1133 pages for 'multi site'.

Page 14 of 1133

  • Using 301 Redirects on new site when access to old site denied?

    - by Cape Cod Gunny
    I have a situation where I'm standing up a new website with a different web host. The hosting company has denied me access to the old site, and the old site will most likely be turned off very soon. If my new site contains pages that are named slightly differently, how do I go about setting up 301 redirects? For example: www.oldsite.com/aboutus/ should redirect to www.newsite.com/aboutus.html, and www.oldsite.com/productx/ should redirect to www.newsite.com/productx.html. Edit: Clarification: the old domain name is different from the new domain name. On my new site, do I just duplicate every page that existed on the old site and place redirect code inside those pages? What does the redirect code look like? (A sketch follows this entry.)

    Read the article
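
    The asker's last question (what does the redirect code look like?) can be answered with a short example. If the old domain can eventually be pointed at the new host, a small script answering the old paths could issue permanent redirects to the renamed pages. A minimal PHP sketch, assuming the new host can run PHP; the path-to-page mapping is illustrative, not taken from the question:

    ```php
    <?php
    // old-url-redirects.php - a sketch, not the asker's actual pages.
    // Map old URL paths to their renamed counterparts on the new site.
    $redirects = [
        '/aboutus/'  => 'http://www.newsite.com/aboutus.html',
        '/productx/' => 'http://www.newsite.com/productx.html',
    ];

    $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

    if (isset($redirects[$path])) {
        // 301 marks the move as permanent for browsers and crawlers.
        header('Location: ' . $redirects[$path], true, 301);
        exit;
    }

    // Any old URL without a named counterpart falls back to the home page.
    header('Location: http://www.newsite.com/', true, 301);
    exit;
    ```

    On an Apache host the same mapping could instead be expressed as RewriteRule directives in an .htaccess file; the important part is that the response status is 301 so search engines treat the move as permanent.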

  • Duplicating content from another site and adding value (summaries, statistics) - ranking and courtesy

    - by Krastanov
    I am working on a site that takes a governmental database, provides a number of statistical and other summaries, and also posts the original data. However, this data (mostly long pieces of text) is also published on the official governmental site (without the added value of the summaries). Should I worry about Google ranking due to this duplication? What is the preferred way to point to the official source of the information? (A sketch follows this entry.) There is no advertising on my site. My site is ".com". The governmental site is ".bg".

    Read the article
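
    For the second question (how to point to the official source), one common approach is a visible citation link plus a cross-domain rel=canonical on pages that republish the text more or less verbatim. A rough PHP sketch, assuming the pages are PHP-rendered; the variable and URL are hypothetical:

    ```php
    <?php
    // Hypothetical: the URL of the same document on the official .bg site.
    $officialUrl = 'http://www.example.bg/documents/12345';
    ?>
    <head>
        <!-- Asks search engines to treat the official copy as the original. -->
        <link rel="canonical" href="<?php echo htmlspecialchars($officialUrl); ?>">
    </head>
    <body>
        <!-- A visible citation for readers as well as crawlers. -->
        <p>Source: <a href="<?php echo htmlspecialchars($officialUrl); ?>">official government publication</a></p>
    </body>
    ```

    Note that a cross-domain canonical asks search engines to index the official copy instead of yours, so it suits pages that are mostly republished text; pages that are mostly your own summaries and statistics can simply cite the source with a normal link.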

  • Site owner cannot access the site

    - by Daveo
    I made a website, and it works for everyone except the site owner, who cannot access it. He gets a "server could not be found" error. I went to his house and the issue occurs on all of his devices (PC, laptop, iPhone on WiFi); when I turn WiFi off on the iPhone, the site works. So I thought it was a problem with his router and reset it to factory defaults, but it still does not work. I have also flushed his DNS cache and his browser cache, but nothing seems to help. He cannot even ping the server using the domain name; however, when he uses the IP address, he can see the site fine. I looked in cPanel for any blocked IP addresses but cannot see anything. What could be stopping just him from accessing the site via the domain name but not the IP address? Site details: PHP, osCommerce, hosted by site5 on my reseller account.

    Read the article

  • Site inaccessible by some people, fine for others [on hold]

    - by Paul Howell
    A couple of days ago my website www.howellphoto.com (a WordPress site hosted by one.com) started loading really slowly, and I have been unable to access any pages linked from the homepage. Several of my friends have the same issue, yet many others are able to access the site without problems. Live support at one.com has not been much help; they requested the IP addresses of a few people who cannot access the site and said it could be a firewall issue. WordPress support (my site was created with prophotoblogs) has been better and has updated all plugins, etc., but can see no issue from their end. My main issue is that even if there were a local fix I could apply on my computer, it would not help any potential customers visiting my site for information! This is driving me crazy! Any help will be legendary! Cheers, Paul

    Read the article

  • Another website is mirroring and ranks above my site in search results

    - by Marlboro Goodluck
    There is a site of ill repute known as thedirty which has completely mirrored my site and now has links appearing at the #1 spot on Google using my content. I checked my log files and noticed that this site has been crawling mine for some time; it also has 10,000 links from its site to mine. I have blocked visitors referred from this site and have already reported it to Google as web spam. I also disavowed the domain. How are they getting top positions in Google (even overtaking mine) with such nefarious tactics? What are the steps to completely eliminating an issue like this?

    Read the article

  • Google not showing any pages from my site in the index after three months [on hold]

    - by Alex Coisman
    Despite my having a sitemap and using Google Webmaster Tools, it has been over 3 months and my site has not been added to the Google index at all. Here's the site: www.famouslefthandedpeople.com As far as I know, I have done everything correctly; however, there must be something I am overlooking that is preventing Google from indexing the site. I do not have a robots.txt file, so allow/disallow isn't the issue. Although the content of the site is sparse, it is original and not duplicated internally or externally, so Panda/Penguin should not be a problem. I have reviewed the answers at "Why isn't my website in Google search results?" and I don't think they apply here. If it matters, I am using WordPress to create the site. What other factors should I be looking at in order to troubleshoot this?

    Read the article

  • Site gone from Google due to a DMCA complaint

    - by whatismyans
    My site's traffic dropped to nothing after Google received a DMCA complaint about my site. My site is an MP3 search engine, so it can index copyrighted MP3s without my knowledge. I received a message in Webmaster Tools and have removed the copyrighted content from my site, but traffic from Google is not recovering. What is the problem? Do I need to tell Google that I have removed the copyrighted content? And have I lost my site's Google traffic forever?

    Read the article

  • Box Selection and Multi-Line Editing with VS 2010

    - by ScottGu
    This is the twenty-second in a series of blog posts I'm doing on the VS 2010 and .NET 4 release. I've already covered some of the code editor improvements in the VS 2010 release. In particular, I've blogged about the Code Intellisense Improvements, new Code Searching and Navigating Features, HTML, ASP.NET and JavaScript Snippet Support, and improved JavaScript Intellisense. Today's blog post covers a small, but nice, editor improvement in VS 2010: the ability to use "Box Selection" when performing multi-line editing. This can eliminate keystrokes and enables some slick editing scenarios. [In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu] Box Selection: Box selection is a feature that has been in Visual Studio for a while (although not many people knew about it). It allows you to select a rectangular region of text within the code editor by holding down the Alt key while selecting the text region with the mouse. With VS 2008 you could then copy or delete the selected text. VS 2010 enables several more capabilities with box selection, including: text insertion (typing with a box selection inserts the new text into every selected line); paste/replace (you can paste the contents of one box selection into another and have the content flow correctly); and zero-length boxes (you can make a vertical selection zero characters wide to create a multi-line insertion point for new or copied text). These capabilities can be very useful in a variety of scenarios, for example changing access modifiers (private -> public), adding comments to multiple lines, setting fields, or grouping multiple statements together. Brittany Behrens from the Visual Studio Editor Team has an excellent 3-minute video demo that shows off a few cool VS 2010 multi-line code editing scenarios with box selection; watch it to learn a few ways you can use this new box selection capability to optimize your typing in VS 2010 even further. Hope this helps, Scott. P.S. You can learn more about the VS Editor by following the Visual Studio Team Blog or by following @VSEditor on Twitter.

    Read the article

  • UML Diagrams of Multi-Threaded Applications

    - by PersonalNexus
    For single-threaded applications I like to use class diagrams to get an overview of the architecture of the application. This type of diagram, however, hasn't been very helpful when trying to understand heavily multi-threaded/concurrent applications, for instance because different instances of a class "live" on different threads (meaning that accessing an instance is safe only from the one thread it lives on). Consequently, associations between classes don't necessarily mean that I can call methods on those objects directly; instead I have to make the call on the target object's thread. Most of the literature I have dug up on the topic, such as Designing Concurrent, Distributed, and Real-Time Applications with UML by Hassan Gomaa, has some nice ideas, such as drawing thread boundaries into object diagrams, but overall seems a bit too academic and wordy to be really useful. I don't want to use these diagrams as a high-level view of the problem domain, but rather as a detailed description of my classes/objects, their interactions, and the limitations imposed by the thread boundaries mentioned above. I would therefore like to know: What types of diagrams have you found most helpful for understanding multi-threaded applications? Are there any extensions to classic UML that take into account the peculiarities of multi-threaded applications, e.g. annotations illustrating that some objects might live in a certain thread while others have no thread affinity; that some fields of an object may be read from any thread but written to only from one; or that some methods are synchronous and return a result while others are asynchronous, queuing requests and returning results, for instance, via a callback on a different thread?

    Read the article

  • Multi Threading - How to split the tasks

    - by Motig
    If I have a game engine with the basic 'game engine' components, what is the best way to split the tasks with a multi-threaded approach? Assume I have the standard components (rendering, physics, scripts, networking) and a quad-core CPU. I see two ways of multi-threading. Option A ('vertical'): allow one core for each component of the engine, e.g. one core for the rendering task, one for physics, and so on. The advantages are that I do not need to worry about thread safety within each component, and that I can take advantage of special optimizations provided for single-threaded access (e.g. DirectX offers a flag that can be set to tell it that you will only use single-threading). Option B ('horizontal'): each task may be split across 1 <= n <= numCores threads, and the tasks are executed one after the other, each one running across all cores. The advantages are that it allows work sharing, i.e. each thread can take over work still remaining while the others are still processing, and that I can take advantage of libraries that are designed for multi-threading (i.e. ... DirectX). I think, in retrospect, I would pick Option B, but I wanted to hear your thoughts on the matter.

    Read the article

  • New site – and a special offer

    - by Red Gate Software BI Tools Team
    SSAS Compare has a brand new website! The old page was thrown together in the way that most Red Gate labs sites tend to be — as experimental sites for experimental products. We’ve been developing SSAS Compare for a while now, so we decided it was time for something a bit prettier. The new site is mostly the work of Andrew, our marketing manager, who has all sorts of opinions about websites. One of the opinions Andrew has is that his photo should be on every site on the internet, or at least every Red Gate site on the internet, and that’s why his handsome visage now appears on the SSAS Compare page. Well, that isn’t quite true. According to Andrew, people download more software when they have photos of human beings to look at. We want as many people to try SSAS Compare as possible, so we got the team together for an intimate photoshoot directed by Red Gate’s resident recorder of light, Dom Reed (aka Mr Flibble). The photo will appear on the site as soon as Dom is finished photoshopping us into something more palatable, which is a big job. Until then, you’ll have to put up with Andrew. We’ve also used the new site to announce a special offer. Right now, SSAS Compare is still a free beta, but by signing up to our Early Access Program, you’ll get a 20% discount when we release SSAS Compare as a fully-fledged product. We’ll use your email address to send you news and updates about business intelligence tools from Red Gate (and nothing else). If that sounds good to you, go to the SSAS Compare site to sign up. By the way, the BI Tools team wasn’t the only thing Dom photographed last week. Remember Noemi’s blog about the flamenco dance? We’ll be at SQL Saturday in our home town of Cambridge this Saturday (8th September), handing out flyers of a distinctly Mediterranean flavour. If you’re attending, be sure to say hello!

    Read the article

  • How to fix damage from a site hack?

    - by Towhid
    My site was hacked. I found the vulnerability, fixed it, and removed the shell scripts, but the hacker had uploaded thousands of web pages to my web server. After I removed those pages I was left with over 4,000 "Not Found" pages on my site (all linked from an external free domain and host, which has since been taken down). Hundreds of keywords had also been added to my site, and three weeks later I can still see keywords from the removed pages in my Google Webmaster Tools. I used to have the first result on Google for certain keywords, but now I am on the third page for the same keywords; 50% of my traffic used to come from Google, and that is now down to 6%. How can I fix both the "Not Found" pages (a sketch follows this entry) and the useless new keywords, and will that be enough to get me back to the first result on Google? P.S.: 1) Both the vulnerability and the uploaded files have definitely been removed. 2) My site is not infected; I checked in Google Webmaster Tools and a few other security scanning tools. 3) All the files had been uploaded to one directory, so I ended up with URLs like site.com/hacked/page1.html and site.com/hacked/webpage2.html.

    Read the article
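
    For the "Not Found" half of the question, one way to get the leftover URLs dropped from the index faster is to answer them with 410 Gone rather than 404, which tells crawlers the pages were removed deliberately and permanently. A minimal PHP sketch, assuming (as in the question) that the hacked pages all lived under a single /hacked/ directory and that the web server routes requests for that directory to this script:

    ```php
    <?php
    // gone.php - answer leftover URLs from the hack with 410 Gone.
    // Assumes the server rewrites /hacked/* requests to this script.
    $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

    if (strpos($path, '/hacked/') === 0) {
        // 410 signals "gone for good", which crawlers usually act on
        // faster than a plain 404.
        http_response_code(410);
        echo 'This page was removed and will not return.';
        exit;
    }

    // Anything else reaching this script is an ordinary missing page.
    http_response_code(404);
    echo 'Page not found.';
    ```

    Requesting removal of the /hacked/ directory in Google Webmaster Tools and resubmitting a clean sitemap should also speed things up; the stale keywords will fade as the removed pages drop out of the index.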

  • Why was my site rejected for Google Adsense?

    - by hyuun jjang
    I have a 3-year-old blog with around 16 articles/tutorials about programming problems and solutions. It has been getting quite a lot of views lately, so I decided to apply for a Google AdSense account. When I first applied via Blogger, Google replied with the following statement: "Page Type: In order to participate in Google AdSense, publishers' websites and application information must satisfy the following guidelines: - Your website must be your own top-level domain (www.example.com and not www.example.com/mysite). - You must provide accurate personal information with your application that matches the information on your domain registration. - Your website must contain substantial, original content..." So, as I understood it, I decided to buy a domain and point my Blogger blog at that new naked domain; this is the newly bought domain where all the content of my old blog now resides: http://icodeya.com/ I reapplied, hoping that this time I would make the cut, but then I got this reply: "Further detail: Unable to review your site: While reviewing http://www.icodeya.com/, we found that your site was down or unavailable. We suggest you check whether there was a typo in the URL submitted. When your site is operational, you can resubmit your application with the correct site by following the directions below." I'm a bit disappointed. Maybe I did something wrong with the DNS configuration or something, but you can clearly see that my site is fully functional. I heard that Google sends robots to crawl the site, etc. It's just sad, because I invested in a domain name and now I can't even find a way to earn from it. Any tips?

    Read the article

  • How to correctly handle redirect after site facelift

    - by Stefan
    I recently updated our site, taking it from a multi-page site to a single-page site. The problem now is that when the site comes up in, say, a Google search, the results show the old indexed sub-pages as well as the site itself. So if a user clicks, say, our "About" page, it takes them to our now-outdated material. I am hoping to get some guidance on how to handle this properly. I figure the first step is to set up a robots.txt telling the engines not to crawl beyond index.php. But in the meantime, how do I handle users who find our site on Google and try to click on the old sub-page links? Should I simply set up redirects while waiting for the engines to update? And if so, do I need to set up redirects on each page using PHP (a sketch follows this entry), or is this something I would take care of in our site's control panel? I am not very familiar with redirects... Any help is appreciated!

    Read the article
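
    If the old sub-pages are PHP files that still exist on the server, a per-page 301 is straightforward: each retired page redirects to the new single page (or to the matching section anchor). A minimal sketch with hypothetical page names, usable as a shared include pulled in at the top of every outdated page:

    ```php
    <?php
    // redirect-old-pages.php - included at the top of each retired sub-page.
    // Page names and anchors are hypothetical examples.
    $sections = [
        'about.php'    => '/index.php#about',
        'services.php' => '/index.php#services',
        'contact.php'  => '/index.php#contact',
    ];

    $script = basename($_SERVER['SCRIPT_NAME']);
    $target = isset($sections[$script]) ? $sections[$script] : '/index.php';

    // 301 so search engines replace the old URLs with the new one over time.
    header('Location: ' . $target, true, 301);
    exit;
    ```

    One caution on the robots.txt idea: blocking the old URLs from crawling would also stop the engines from ever seeing these redirects, so it is usually better to leave the old URLs crawlable until the index has caught up.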

  • Site experiencing low traffic volume between 8AM and 4PM BST

    - by BizNuge
    There may be no definitive answer to this question, but I thought peer review of the problem might stimulate some ideas on the topic. We have a boutique sales site that is experiencing low volumes of traffic (both UK and international) between 8AM and 4PM BST. This seems strange, since our target audience for the site is UK based and this would seem to be when people are awake and online. We are in contact with another boutique site in the same sector that doesn't experience this issue, which makes it seem stranger still. Later in the day we get traffic from the UK as well as a fair amount of international traffic, so I'm at a loss to figure this one out. The site is fairly well optimised, including: a sitemap.xml; proper caching policies across the board; Google Merchant; Dublin Core microdata; HTML5; pretty URLs; meta data and content reviewed as an ongoing concern; decent sitelinks for direct queries through Google on the site name; a decent number of inbound links; FB, Twitter, and Google +1; and a verified Google Maps listing. The site has been selling for ~4 months and is getting ~250 users per day. So I'm not entirely sure how to explain the mid-day dip in our figures... Any ideas at all would be useful. Cheers all!

    Read the article

  • how did Google Analytics kill my site?

    - by user1813359
    Yesterday I created a Google Analytics profile for one of my sites and included the JS block in the layout template. What happened next was very strange: within about two minutes, the site had become unreachable. I had been checking the AWStats page for the site when I thought to set up GA. After that was done, I clicked on the link for 404 stats, which opens in a new tab. It churned for a long while and then showed a nearly blank page, similar to what you see when Firefox chokes on a badly formatted XML page, except there was no error message. But I was logged into the server and could see that the page has an HTML 4.01 Transitional DTD. Strange! I tried viewing source, but it just churned endlessly. I then tried "inspect element" and was able to see an error message having to do with some internal Firefox library; unfortunately, I neglected to copy it. :-( All further attempts to load anything on the site would time out. Firebug's Net panel showed no request being made. Chrome would time out. So I deleted the GA profile, removed the JS block, and cleared the server cache. No joy. I then removed all Google cookies and disabled JS. Still nothing. No luck in any other browser. And now my client couldn't access the site. Terrific. I was able to use wget while logged into another server; the retrieved page was fine and did not contain the GA JS block. However, the two servers are on the same network (perhaps a clue). The server itself was fine: ping and traceroute looked great, and I could SSH in. I tailed the access log and tried a browser request. Nothing. But I forgot to quit, and a minute or so later I saw a request from someone else being logged. Later, I could see that requests had been served all day to some people. Now, 24 hours later, the site works once again, but it is still unreachable by the client (who is in another city). So, does anyone have some insight into what's going on? Does this have something to do with Google's CDN? I don't know very much about how GA works, but what I'm seeing reminds me of DNS propagation issues. And why the initial XML error? And why the heck was the site just plain unreachable? What did Google do to my site?! Sorry for the length, but I wanted to cover everything.

    Read the article

  • Screen-scraping a site with an ASP.NET form login in C#?

    - by Ajit
    Hi friends, I've created a web application in ASP.NET in which I'm trying to get some data (site scraping) from a secure page of a website. I've used the HttpWebRequest class for this, but I haven't managed to reach the secure page yet; every time, it is the login page that gets scraped, not the secure page. I have the site's user ID and password, but I don't know which language the site has been developed in. Please advise: what should I do? (A sketch of the general flow follows this entry.)

    Read the article
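
    The question is about C#'s HttpWebRequest, but the flow is the same in any language: POST the login form, keep the session cookie it sets, then request the protected page with that cookie (with HttpWebRequest the equivalent is sharing one CookieContainer across both requests). A rough sketch of that flow using PHP's cURL functions, with hypothetical URLs and form field names; inspect the real login form to find the actual ones:

    ```php
    <?php
    // Sketch only: log in via the site's form, then fetch the secure page.
    $cookieJar = tempnam(sys_get_temp_dir(), 'cookies');

    // Step 1: POST the login form and store the session cookie.
    $login = curl_init('https://example.com/login.aspx');
    curl_setopt_array($login, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query([
            'username' => 'my-user-id',   // hypothetical field names;
            'password' => 'my-password',  // use the names from the real form
            // An ASP.NET form usually also needs the hidden __VIEWSTATE and
            // __EVENTVALIDATION values scraped from the login page first.
        ]),
        CURLOPT_COOKIEJAR      => $cookieJar,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($login);
    curl_close($login);

    // Step 2: request the protected page, sending the stored cookie.
    $page = curl_init('https://example.com/secure/report.aspx');
    curl_setopt_array($page, [
        CURLOPT_COOKIEFILE     => $cookieJar,
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $html = curl_exec($page);
    curl_close($page);

    echo $html; // markup of the secure page, ready for parsing
    ```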

  • IIS7 Handler Mapping Migration from Sites Config to Server Config [migrated]

    - by Danomite
    We have a bunch of sites running with about 8 handler mappings in their web.config files. Unfortunately, those mappings were copied from site to site every time a new one was added. Now the time has come for me to get them out of all the web.config files and into the server's Handler Mappings. If I add a mapping to the server while it still exists in a web.config, IIS throws an error when you browse to the site. I have a few dozen web.config files to edit here, with about 10 mappings in each. Is there a way to add these mappings to the server without having to go in and edit each web.config file manually? Otherwise, every site will be down for a few minutes while I go into each file and remove the handlers. Thanks!

    Read the article

  • Silverlight TV 19: Hidden Gems from MIX10, UFC's Multi-Touch App

    John ran into Silverlight MVP Ward Bell of IdeaBlade while at MIX10 (how could anyone miss him!). Ward was kind enough to sit and talk with John to show off the multi-touch application his company wrote for UFC using Silverlight. It uses multi-touch, Caliburn, MVVM, and, of course, Silverlight! Relevant links: John's Blog, Ward's Blog, Silverlight 4 RC Features (or download here). Follow us on Twitter @SilverlightTV, and learn more about Silverlight with the new Silverlight Training Course.

    Read the article

  • "Il faut repenser les OS pour les processeurs multi-coeurs", d'après Microsoft

    "Il faut repenser les OS pour les processeurs multi-coeurs", d'après Microsoft Dave Probert est expert du noyau chez Microsoft. Selon lui, l'approche actuelle du multi-coeur n'est pas encore à même d'en exploiter toute la puissance, et est trop "compliquée". Aussi, propose-t-il une autre organisation. Car, d'après lui, "la solution" ne se situerait pas dans l'amélioration de techniques "comme le parallel programming, mais plutôt dans le refonte des abstractions de base qui constituent le modèle du système d'exploitation". Il explique qu'on ne tire pas assez parti des performances offertes par les processeurs multicoeurs et qu'aujourd'hui, on ne devrait plus avoir à patienter de...

    Read the article
