Search Results

Search found 28303 results on 1133 pages for 'multi site'.


  • I need a relatively cheap host that can handle sudden peaks in traffic

    - by Morten K
    Hello, we're launching a product in a few months, which will obviously have a website. Judging from our current traffic, we believe that overall traffic will probably not be that much, but we are aiming at promoting the site heavily using social media. This has the typical problem: if we suddenly get picked up by a large tech blog, we will see a sudden burst, a very heavy increase in traffic all of a sudden. If we use a budget host like our current one (www.unoeuro.com) or something similar like GoDaddy, I'm afraid that the site will go down under the load. If that happens, then we might as well have thrown our social media marketing dollars out of the window. Our site will be relatively lightweight, with all videos hosted at YouTube or Vimeo and otherwise mainly just a standard webpage (i.e. nothing too heavy). I am hoping for recommendations for a good hosting company which has some form of scalable hosting, so if/when a traffic surge hits, the site will not go down.

    Read the article

  • I have done everything correctly on my ASP.NET website regarding SEO; why aren't Google backlinks showing?

    - by Jason Weber
    I recently implemented many SEO techniques for a company on their ASP.NET website; in 6 months, we jumped from a PR1 to a PR3. But I'm having issues with Google backlinking. Here are some of the things I've done: Not only did I set up their own Google+ page 6 months ago, I update it pretty much daily with links, pictures, etc., and I blog about it on my own personal Google+ page and post links there too. They have their own Twitter, Facebook, and YouTube accounts, and all are updated almost daily. I listed the site in as many quality, relevant directories as possible 6 months ago, and I've avoided link farms. The site is solid SEO-wise: key-phrase rich URLs, schema.org and rich snippets, no duplicate content, and www vs. non-www 301s, trailing slashes, etc. are all taken care of. Probably a ton of other things, but basically, the site is all set, SEO-wise. Here's what's confounding: when I do a link:www.example.com in Bing/Yahoo, it shows many backlinks. When I do a link:www.example.com in Google, it shows 0 links. And when I use a site ranker like Web Site Rank Tool, it shows 0 backlinks from Google. Any suggestions would be appreciated!

    Read the article

  • In Chrome, is there a way to search only within my bookmarked sites?

    - by McGirl
    I'm not trying to search for a bookmark - I'm trying to search for a term that may appear on one of my bookmarked sites. For instance, if I wanted to search on the term "Obama" and restrict my query to the New York Times website, I know I could type this into the main URL/search bar in Chrome: "obama site:nytimes.com". I'm wondering if there's a way to change that so that it's effectively "obama site:(any one of the sites in my Chrome bookmark folder)". Thanks in advance for your help!

    Read the article

  • Multiple sites with the same codebase in Python

    - by Jimmy
    I am trying to run a large number of sites which share about 90% of their code. They are simply designed to query an API and return the results. They will have a common user base / database but will be configured slightly differently and will have different CSS (perhaps even different templating). My initial idea was to run them as separate applications with a common library, but I have read about the sites framework, which would allow them to run from a single instance of Django and might help to reduce memory usage. https://docs.djangoproject.com/en/dev/ref/contrib/sites/ Is the sites framework the right approach to a problem like this, and does it have real benefits over running separate applications? Initially I thought it was, but now I think otherwise. I have heard the following: your SITE_ID is set in settings.py, so in order to have multiple sites, you need multiple settings.py configurations, which means multiple distinct processes/instances. You can of course share the code base between them, but each site will need a dedicated worker / WSGI daemon to serve it. This effectively removes any benefit of running multiple sites under one roof, if each site needs a uWSGI instance running. Alternative ideas of systems: https://github.com/iivvoo/django_layers https://github.com/shestera/django-multisite I don't know which route to take with this.
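    Editor's note: a minimal sketch of the "shared code base, one small settings module per site" layout discussed above (module names, paths and the database engine are placeholders, not taken from the original post):

        # settings_base.py - settings shared by every site
        INSTALLED_APPS = [
            "django.contrib.sites",        # the sites framework; needs SITE_ID below
            "django.contrib.contenttypes",
            "django.contrib.auth",
            # ... the shared apps that query the API ...
        ]
        DATABASES = {
            "default": {
                "ENGINE": "django.db.backends.postgresql_psycopg2",
                "NAME": "shared_userbase",   # common user base / database
            }
        }

        # settings_site_a.py - everything site-specific lives here
        from settings_base import *

        SITE_ID = 1                                       # matches a row in the django_site table
        TEMPLATE_DIRS = ("/srv/sites/site_a/templates",)  # per-site CSS / templating

    Each site then runs with its own DJANGO_SETTINGS_MODULE (e.g. settings_site_a), which is exactly the per-site worker the poster mentions; the saving is in the shared code, apps and database, not in the number of processes.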

    Read the article

  • How does one set up an API key on a locally hosted server?

    - by L33tCh
    I am setting up a personal WordPress site and want to be able to post to it from other sites... the common request being for my "API key". When creating a site on Wordpress.com, for example, an API key is sent to you by mail, but surely it should be relatively simple (if it's not just an address on the local site to point to) to have one for a personal server (Ubuntu server)?

    Read the article

  • Seizing naming master from child domain server

    - by meera
    When I try to seize the naming master role from my child domain server, I get the following error:

        fsmo maintenance: seize naming master
        Attempting safe transfer of domain naming FSMO before seizure.
        ldap_modify_sW error 0x34(52 (Unavailable). Ldap extended error message is 000020AF: SvcErr: DSID-03210380, problem 5002 (UNAVAILABLE), data 8438
        Win32 error returned is 0x20af(The requested FSMO operation failed. The current FSMO holder could not be contacted.)
        )
        Depending on the error code this may indicate a connection, ldap, or role transfer error.
        Transfer of domain naming FSMO failed, proceeding with seizure ...
        Server "win-fb20ixk90mu" knows about 5 roles
        Schema - CN=NTDS Settings,CN=WIN-3918XHC5STU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        Naming Master - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        PDC - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        RID - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        Infrastructure - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com

    Read the article

  • Title of the page in search results and title of google's cached version are different. Why?

    - by Azmorf
    Check this: http://www.google.com/search?q=site:gunlawsbystate.com+kansas+gun+laws The title of the first result is "Kansas Gun Laws - Gun Laws By State". However, on the page Google has cached, the title is different: <title>Kansas Gun Laws - Kansas Gun Law - Reciprocity Guide</title> Google shows the title that was on the site 2-3 months ago. Googlebot has visited the website many times since then, and as you can see it has even cached it (the latest version is from 15th Sept); however, for some reason it doesn't change the title to the new one in the search results. We use a hash-bang URL structure on this website. It completely meets Google's requirements for AJAX websites (the _escaped_fragment_ mechanism). The issue I explained is happening with almost all hash-bang pages that got indexed. Questions: Why does it keep the old page title in the search results? Can it be connected to the fact that I'm using hash-bang URLs? There are lots of pages on the site that have the same issue, and all of them have hash-bang URLs. Another thing I noticed is that Google's "Preview" feature doesn't work for any hash-bang URLs on the site. Did I do anything wrong? It has cached versions of the pages, so why wouldn't it generate a preview? Thanks (and sorry for my English) PS. Here's a weird thing I also noticed: this search query https://www.google.com/search?q=Kansas+Gun+Laws+-+Reciprocity+Guide shows the correct title for the same page as in the example above. Why does Google show different titles for the same page when you run different queries?

    Read the article

  • Making internal website available publicly (Win 2008 Server)

    - by endigo
    I have an IIS 7 web site that is running on a Windows Server 2008 (64-bit) VMware guest on a Windows Server 2008 (64-bit) host on my local network. My router is a Firebox XEdge and it has port 80 directed to the IP of the server in VMware. I can reach the web site from inside the network, but I cannot reach the site from outside the network. I have other web sites that are working through the Firebox, and I am confident that it is configured correctly. I suspect that Windows Server 2008 is blocking routed or public addresses, but I have shut down the firewall on the server that is running in VMware and the AVG anti-virus to no avail. How can I make my site available publicly?

    Read the article

  • Website not reachable via mobile device

    - by sanders
    I have a very strange problem with a website. The site works normally when visited from a non-mobile web browser, but when trying to reach the site via a mobile device (I tried different browsers) it doesn't get to the website. The only error I see is "This website is not available" and the message is DNS_PROBE_POSSIBLE. Does anyone know where I should look for the problem? The website is based on Joomla. The URL of the site is: http://bit.ly/1bOToII

    Read the article

  • How do I prevent spawning of zombie-like apache2 processes on Dreamhost VPS?

    - by Jonathan Hayward
    I have had a website for months or longer on a DreamHost VPS, and I have vague memories of having to turn off some customized Apache under /dh during initial setup to get a standard Apache 2.x working. Things had been going along on an even keel, but when I started making some changes lately, I found that when I tried to bounce Apache (/usr/sbin/apachectl restart), it couldn't bind to port 80, and my site had been converted from a big literature site to a small parking site. I tried to see what was listening on 80, and it was a DreamHost-customized Apache that had spawned. I killed all of those processes, restarted Apache, and changed the parent directory under /dh to mode 000. That was a day or two ago. I was bouncing Apache again while trying to get a new site to load under HTTPS, and I found that once again DreamHost's Apache had spawned, from the directory I set to mode 000, and once again converted my site to a parking page. I renamed the directory, but I am very skeptical about whether I have permanently killed the DreamHost-customized Apache. Besides duct-tape options like a crontab that kills and deletes it each minute, how can I permanently turn off the Apache processes that are spawning from a location under /dh and interfering with standard Apache? What should I be doing that I am not? Can DreamHost's technical support stop the interference? Thanks,

    Read the article

  • Search engine (Bing) lost my www

    - by Jason
    I just found that my site's listing in Bing is broken, because Bing displays the wrong domain name: my site's domain name is www.mysite.com, but Bing lists it as mysite.com. How can I ask Bing to change it to the right one? Other search engines list it correctly.

    Read the article

  • Google affecting my SERP Rank?

    - by Asad Moeen
    The following are some of my website's details. Home page: [thebluewaffles].[com] Keywords: Blue Waffles - the rest of the keywords are post/subject specific. Site description: Health Articles Blog. Site age: 1.5 years. A short history: when I started my website, the few things in my mind when posting content were at least 500 words on each page and writing all the articles with to-the-point information. I didn't go really fast with it, which is why I only have about 15 articles in 1.5 years. The SEO strategy was fairly simple: I shared links through social marketing websites and some article sharing websites, after which I could see my website's rankings in the top 5 SERP results. I ranked well enough for about 8 months continuously but didn't keep updating content, so there were some 3 rough months when no content was posted due to personal work. The SERPs dropped to the 2nd page in April and almost started disappearing in May. I asked a lot of people about it and most came up with the reason of "no updates to the site", so I started updating my site again; November has almost started and I still see no signs of my website's rankings recovering. Another important point is that when I post a new article and do a title search in Google, I see it ranks well enough for the first 10 hours and then disappears. What could be wrong here?

    Read the article

  • How to serve 410 from Apache rather than a 404

    - by DanSingerman
    On our site, we tend to remove pages a lot when the content has expired, and we want to return HTTP status 410 rather than 404 for requests to pages (physical files) that don't exist on our server (the entire site is made up of static files). We have tried the approach from http://diveintomark.org/archives/2003/03/27/http_error_410_gone but that just breaks our entire site, serving a 410 for every request. We are using Apache 2.2.3.
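    Editor's note: a minimal mod_rewrite sketch of what the question is after - serving 410 only for paths that don't map to an existing file or directory - assuming mod_rewrite is enabled and the rules live in the site's vhost or .htaccess:

        RewriteEngine On
        # Only requests that match no physical file and no directory get 410 (Gone)
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule .* - [G]

    Without the two RewriteCond lines, the [G] rule fires for every request, which matches the "410 for everything" breakage described above.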

    Read the article

  • Making an advertising server to serve ads from different ad networks

    - by John
    In India there are many ad networks (other than AdSense) who pay per acquisition or per lead, so JavaScript ad code is not required (fraud clicks don't matter as long as one converts). An ad network will have many companies, and each company will have many banner sizes for its ads. Also, any ad may suddenly be stopped just because the company's target has been met, which is a common nuisance, since if we don't remove those URLs then that company will get conversions for free. I have a dozen sites, and removing the ads every now and then is difficult. Also, CPA-based ads may not convert at all, which means I'll need to remove non-performing ads regularly. I've gone through: How can I show multiple ad networks on my site? I've also looked at the DFP solution, but without AdSense they wouldn't let me open an account. I want to make an ad server wherein I'll feed new ads (banner image + link for the click) and maintain categories there (like shoes, phones, books, etc.). So if an ad is paused, I'll simply remove/pause the ad there while other ads in the category keep running. Also, changing ad code within sites will no longer be required. For example, let me have an ad category "clothing" where I can add ads from different companies. If one of my sites requests an ad from there, it'll randomly select an ad in this category and return it to the site for display. Removing/adding ads within this category will not affect the sites requesting those ads. Any idea how to implement it?
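    Editor's note: a minimal sketch of the category-based rotation described above (plain Python with in-memory data; names, paths and URLs are placeholders, and in practice the ads would live in a database behind a small HTTP endpoint):

        import random

        # Hypothetical ad store: category -> list of ads. An admin UI would
        # add new ads and flip "active" to pause a company whose target is met.
        ADS = {
            "clothing": [
                {"image": "/banners/brand-a-300x250.png", "click_url": "https://example.com/a", "active": True},
                {"image": "/banners/brand-b-300x250.png", "click_url": "https://example.com/b", "active": False},  # paused
            ],
            "shoes": [
                {"image": "/banners/brand-c-728x90.png", "click_url": "https://example.com/c", "active": True},
            ],
        }

        def pick_ad(category):
            """Return a random active ad for a category, or None if all are paused."""
            candidates = [ad for ad in ADS.get(category, []) if ad["active"]]
            return random.choice(candidates) if candidates else None

        # A site would request this over HTTP (e.g. GET /ad?category=clothing) and
        # render the returned image/click_url; pausing an ad here takes effect on
        # every site without touching their templates.
        print(pick_ad("clothing"))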

    Read the article

  • Google results show .info domain instead of .com

    - by user481913
    I am on shared hosting currently and I registered this account with a .info domain as the main domain, say MyDomain.info. However, the site runs from MyDomain.com. This is a cPanel-based shared hosting account. MyDomain.info has nothing hosted at all, i.e. no content files. MyDomain.com is set up as an add-on domain and runs from /public_html/MyDomain under MyDomain.info. The problem is that when I type MyDomain as the keyword for a search in Google, it shows result(s) for MyDomain.info, although this is not the intended site and has no content hosted on it. I tried to solve the issue by issuing a 301 permanent redirect from MyDomain.info to MyDomain.com; however, Google keeps on displaying MyDomain.info as the main site in the results even 1 month after the redirect. I want Google to index MyDomain.com as the main site and remove MyDomain.info from the results. Also, is this harmful from the SEO point of view? How can I improve the SEO if it is?
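    Editor's note: for reference, a hedged sketch of what the 301 mentioned above typically looks like in a cPanel/.htaccess setup, assuming Apache with mod_rewrite (the hostnames follow the placeholders used in the question):

        RewriteEngine On
        # Send any request arriving on the .info hostname to the .com site, permanently
        RewriteCond %{HTTP_HOST} ^(www\.)?mydomain\.info$ [NC]
        RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]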

    Read the article

  • http to https upgrade -- SEO troubles

    - by SLIM
    I upgraded my site so that all pages have gone from using http to https. I didn't consider that Google treats https pages differently than http. I re-created my sitemap so that all links now reflect the new https and let it be for a few days. (Whoops!) Google is now re-indexing all the https pages. I have about 19k pages on the site, and Google has already indexed about 8k of the new https ones. The problem is that Google sees all of these as brand new pages when many of them have a long http history. Of course most of you will recognize the problem: I didn't set up a 301 from the old http to the new https. Is it too late to do this? Should I switch my sitemap back to http and then 301 to the new https? Or should I leave the sitemap as is and set up 301 redirects anyway? I'm not even sure if Google is trying to reach the http site anymore. Currently the site is doing 303 redirects (from http to https), although I haven't figured out why yet. Thanks for any suggestions you can offer.
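    Editor's note: a minimal sketch of the missing http-to-https 301, assuming the site runs on Apache with mod_rewrite (it would replace whatever is currently issuing the 303 in the port-80 configuration):

        RewriteEngine On
        # Permanently redirect any plain-http request to the same path over https
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]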

    Read the article

  • Adventures in Drupal multisite config with mod_rewrite and clean urls

    - by moexu
    The university where I work is planning to offer Drupal hosting to staff/faculty who want a Drupal site. We've set up Drupal multisite with clean URLs and it's mostly working except for some weird redirects. If you have two sites where one is a substring of the other then you'll randomly be redirected to the other site. I tracked the problem to how mod_rewrite does path matching, so with a config file like this:

        RewriteCond %{REQUEST_URI} ^/drupal
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /drupal/index.php?q=$1 [last,qsappend]

        RewriteCond %{REQUEST_URI} ^/drupaltest
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /drupaltest/index.php?q=$1 [last,qsappend]

    /drupaltest will match the /drupal line and all of the links on the /drupaltest page will be rewritten to point to /drupal. If you put the end-of-string character ($) at the end of each rewrite condition then it will always match on the correct site and the links will always be rewritten correctly. That breaks down as soon as a user logs in though, because the query string is appended to the URL, so just the base URL will no longer match. You can also fix the problem by ordering the sites in the config file so that the smallest substring will always be last. I suggested storing all of the sites in a table and then querying, sorting, and rewriting the config file every time a Drupal site is requested so that we could guarantee the order. The system administrator thought that was kludgy and didn't address the root problem. Disabling clean URLs should also fix the problem but the users really want them, so I'd prefer to keep them if possible. I think we could also fix it by using an .htaccess file in each site to handle the clean URL rewriting, but that also seems suboptimal since it will generate a higher load on the server and the server is intended to host the majority of the university's external-facing web content. Is there some magic I can do with mod_rewrite to get it to work? Would another solution be better? Am I doing something the wrong way to begin with?
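    Editor's note: one commonly suggested tweak (not from the original post, and worth testing against the login case) is to anchor each prefix so /drupal matches only itself or paths under it; %{REQUEST_URI} does not include the query string, which is why this avoids the problem described for a bare trailing $:

        RewriteCond %{REQUEST_URI} ^/drupal(/|$)
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /drupal/index.php?q=$1 [last,qsappend]

        RewriteCond %{REQUEST_URI} ^/drupaltest(/|$)
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /drupaltest/index.php?q=$1 [last,qsappend]

    With the (/|$) anchor, a request for /drupaltest/node/1 no longer matches the /drupal block, regardless of the order of the sites in the config file.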

    Read the article

  • VMware vSphere 4.1: host performance graphs show "No data available", except the realtime view, which works fine

    - by Graeme Donaldson
    Here's our scenario: Site 1 has 3 hosts, and our vCenter server is here. Site 2 has 3 hosts. All hosts are ESXi 4.1 update 1. If I view the Performance tab for any host in Site 1, I can view realtime, 1 Day, etc., i.e. all the views give me graph data. For the hosts in Site 2, I can view the realtime graphs, but 1 Day and 1 Week both say "No data available". 1 Month has mostly nothing, and 1 Year shows that it was working fine for a long time and then started breaking. (Screenshots of the 1 Month and 1 Year views omitted.) What would cause this loss of performance data?

    Read the article

  • Wiki Application With A Reputation System

    - by Christofian
    I'm really impressed with Stack Exchange's concept of reputation (you gain reputation as you post, and the more you post, the more privileges you get), and I want to apply the concept to a wiki that I am building. Does anyone know of a php wiki that has a concept of privileges/reputation similar to Stack Exchange? I'm not necessarily looking for something identical to SE, I'm just looking for a wiki application that gives users more privileges the more they contribute positively to the wiki (SE has down votes, the wiki should have some way of identifying negative contributions too). The privileges should be category based, so the more active you are in a specific category or page, the more privileges you get for that category. There should also be site wide privileges as well, though those should be harder to access than the category privileges. NOTE: If it is not possible to get category wide privileges and site wide privileges, I will be OK with just category wide privileges or just site wide privileges. I should be able to change the requirements for each privilege, through a administration panel or through editing a file (some wiki applications don't have administration interfaces). Does anyone have a script or a solution that will do this? If the script uses something similar to reputation to determine how much a user has positively contributed to the site, then that is OK too. Please Note: I am looking for a way to rate individual user contributions, not a way to rate the quality of an entire page.

    Read the article

  • Advantage of Software Development [on hold]

    - by user93319
    The worth of the brand a corporation carries is far higher than its physical presence. If you are venturing into an online business, you need to take special care of the corporate image of your business. Nowadays, it is very important for every organization to have its own website. To enhance the online presence of a company it is important to have a good website design as well as a blueprint of the design. A quality site can enhance organizational growth and can lend a hand in achieving the company's goals promptly. Since websites have gained so much importance, it is advisable for an organization to seek professional support for the construction of its own exclusive portals. Expert help may again be essential when one needs a complete renovation on the Web. Any group needs to do a bit of introspection before it decides to look for the services of a professional web software development company. It is good to be completely clear-headed regarding one's requirements. To start with, a business should be familiar with who its potential clientele are. Once this main factor has been given consideration, an organization can go ahead and get its website designed accordingly. On approaching a corporation that offers software development services in India to its customers, a client can make sure that their site is ready with all the most up-to-date features. Professional Assistance Matters: a professional service supplier is known to furnish a site with easy-to-use features that prompt visitors to come back again and again. Yet one more benefit of receiving aid from professionals is that they can advise you on the type of content that you should place on display on your site. For example, a business that wants to draw the interest of experts belonging to the corporate world must make sure that the language used on its website is crisp and official.

    Read the article

  • Apache ProxyPass/ProxyPassReverse to IIS

    - by Dana
    We have an ASP.NET web application which is mapped to a folder on an Apache-hosted PHP site using ProxyPass/ProxyPassReverse. A couple of problems are being encountered. Cookies are being lost, which breaks the site navigation; this can be overcome by setting the ASP.NET app as cookieless. Forms authentication is used on the ASP.NET site; this is also broken with the ProxyPass in place, and we suspect this is cookie-related as well. The ASP.NET site works OK when run from its own domain/IP address. Use of a separate domain / sub-domain is not an option due to client requirements.
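    Editor's note: a hedged sketch of the cookie-related mod_proxy directives that usually matter in this kind of setup (hostnames and paths are placeholders, not from the original post):

        ProxyPass        /aspapp/ http://iis-backend.internal/
        ProxyPassReverse /aspapp/ http://iis-backend.internal/
        # Rewrite the Domain/Path attributes of Set-Cookie headers coming back
        # from IIS so the browser will send the cookies back through the proxy
        ProxyPassReverseCookieDomain iis-backend.internal www.example.com
        ProxyPassReverseCookiePath   / /aspapp

    If the backend sets its cookies for path /, ProxyPassReverseCookiePath maps them onto the proxied folder, which is typically what breaks forms authentication behind a plain ProxyPass.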

    Read the article

  • "Opportunity" to take over maintenance of a small internal website. What should I do?

    - by Dan
    I have been offered an "opportunity" to take over maintenance of a small internal website run by my group that provides information about schedules and photos of events the group has done. My manager sent me the link to the site and I checked it out. The site looked clean and neat but loaded in ~5 seconds. I thought this was a little long considering the site really didn't contain a lot of content. This prompted me to take a look under the hood at the page's source code. To my horror, it had been totally hacked together using nested tables! I'm new so I really can't say no to this "opportunity", so what should I do with it? Every fiber of my being feels that the only correct thing to do is overhaul the site using CSS, divs, spans and any other appropriate tags that a sane/good web developer would have used to begin with, instead of depending on the rendering magic of tables. But I'd like to ask programmers with more experience than me, who have been in this situation: what should I do? Is my only realistic option to leave the horror as is and only adjust the content as requested? I'm really torn between good development and the corporate reality I'm part of. Is there some kind of middle ground where things can be made better even if they're not perfect? Thanks ahead of time.

    Read the article

  • FTP restrict user access to a specific folder

    - by Mahdi Ghiasi
    I have created an FTP site inside the IIS 7.5 panel. Now I have access to the whole site using an administrator username and password. I want to let my friend access only a specific folder of that FTP site (for example, this path: \some\folder\accessible\). I can't create a whole new FTP site for this purpose, since it says the port is being used by another website. How do I create an account for my friend that has access to just a specific folder? P.S: I have read about the User Isolation feature of IIS 7.5, but I couldn't find out how to create a user just for FTP and set it to a custom path.

    Read the article

  • Can Windows Essentials Remove the alescruf.c Virus from a Windows PC?

    - by jmort253
    A colleague just visited a site which had been hacked with the alescruf.c trojan, which infects HTML via embedded JavaScript; Windows PCs are infected by visiting the site. If my colleague has Windows Essentials, which detected and removed the trojan, is there anything else that he needs to do to make sure the virus is no longer active? Windows Essentials detected the trojan within seconds of visiting the site, and he's running a full system scan to see if it still detects anything.

    Read the article
