Search Results

Search found 24735 results on 990 pages for 'site ranking'.

  • What is /etc/apache2/sites-available used for and is it necessary?

    - by Mariane
    I have 3 sites, each with a specific IP, running on apache2 (up-to-date Ubuntu). To put a site online, I just created a file in /etc/apache2/sites-enabled, and in this file I told Apache which directory was the root directory for the site and to which IP it should correspond. So I have 000-default, 001-www.lapf.eu, 002-www.felkin.info and 003-www.seidhr.fr in this directory.

    My first site, lapf, suddenly lost contact with its database after the domain name was transferred from another registrar to the registrar who is also hosting the site's data. Then I did an update, reinstalled mysql-server and mysql-common, and did I-have-forgotten-what to reinstall the locales (utf8 and such), which had vanished for some reason. This fixed my first site.

    Now I have noticed that the other 2 sites are offline. Pointing a browser at them just hangs until timeout. They used to work, and their domain names did not move; they are still registered at the same place. The files are still in /etc/apache2/sites-enabled. I noticed another directory, /etc/apache2/sites-available, with just default and default.ssl in it.

    Why are there 2 directories, sites-enabled and sites-available? Should I copy the files from "sites-enabled" into "sites-available"? Or should I put a modified version of each in "sites-available"?

    command: "apache2ctl -S"

        VirtualHost configuration:
        92.243.20.169:80    Charlotte            (/etc/apache2/sites-enabled/001-www.lapf.eu:1)
        92.243.21.141:80    xvm-21-141.ghst.net  (/etc/apache2/sites-enabled/002-www.felkin.info:1)
        92.243.4.114:80     xvm-4-114.ghst.net   (/etc/apache2/sites-enabled/003-www.seidhr.fr:1)
        wildcard NameVirtualHosts and default servers:
        *:80                is a NameVirtualHost
                default server Charlotte (/etc/apache2/sites-enabled/000-default:1)
                port 80 namevhost Charlotte (/etc/apache2/sites-enabled/000-default:1)
        Syntax OK
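
    For reference: on Debian/Ubuntu, sites-available is meant to hold every virtual host definition, while sites-enabled holds only symlinks to the ones Apache actually loads (apache2.conf includes sites-enabled/*). A minimal sketch of the usual workflow, reusing one of the file names above (adjust names and paths to your setup):

        # keep the real file in sites-available
        sudo mv /etc/apache2/sites-enabled/002-www.felkin.info /etc/apache2/sites-available/
        # enable it by creating the symlink in sites-enabled
        sudo a2ensite 002-www.felkin.info
        # (later, disable it again with: sudo a2dissite 002-www.felkin.info)
        sudo apache2ctl configtest && sudo /etc/init.d/apache2 reload

    Sites that live only in sites-enabled still work, so this layout alone does not explain the two sites timing out, but keeping the definitions in sites-available and enabling them with a2ensite is the arrangement the Debian tooling expects.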

  • I have done everything correctly on my asp.net website, re: SEO; why aren't Google backlinks showing?

    - by Jason Weber
    I recently implemented many SEO techniques for a company on their asp.net website; in 6 months, we jumped from a PR1 to a PR3. But I'm having issues with Google backlinking. Here are some of the things I've done: Not only did I set up their own Google+ page 6 months ago, I update it pretty much daily with links, pictures, etc., and I blog about it on my own personal Google+ page and post links there too. They have their own Twitter, Facebook and YouTube accounts, and all are updated almost daily. I listed the site in as many quality, relevant directories as possible 6 months ago, and I've avoided link farms. The site is solid SEO-wise: key-phrase-rich URLs, schema.org markup and rich snippets. No duplicate content; www/non-www 301s, trailing slashes, etc. are all taken care of. Probably a ton of other things, but basically, the site is all set, SEO-wise. Here's what's confounding: when I do a link:www.example.com search in Bing/Yahoo, it shows many backlinks. When I do a link:www.example.com search in Google, it shows 0 links. And when I use a site ranker like Web Site Rank Tool, it shows 0 backlinks from Google. Any suggestions would be appreciated!

  • Multiple sites with the same codebase in Python

    - by Jimmy
    I am trying to run a large number of sites which share about 90% of their code. They are simply designed to query an API and return the results. They will have a common userbase/database but will be configured slightly differently and will have different CSS (perhaps even different templating). My initial idea was to run them as separate applications with a common library, but I have read about the sites framework, which would allow them to run from a single instance of Django and might help to reduce memory usage. https://docs.djangoproject.com/en/dev/ref/contrib/sites/ Is the sites framework the right approach to a problem like this, and does it have real benefits over running separate applications? Initially I thought it was, but now I think otherwise. I have heard the following: your SITE_ID is set in settings.py, so in order to have multiple sites, you need multiple settings.py configurations, which means multiple distinct processes/instances. You can of course share the code base between them, but each site will need a dedicated worker/WSGI daemon to serve it. This effectively removes any benefit of running multiple sites under one hood if each site needs a uWSGI instance running. Alternative ideas of systems: https://github.com/iivvoo/django_layers https://github.com/shestera/django-multisite I don't know what route to take with this.
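
    For illustration, the "shared code base, per-site settings" setup described above usually looks something like the sketch below (module names here are hypothetical; only SITE_ID and django.contrib.sites are part of Django itself):

        # settings_base.py -- everything the sites have in common
        INSTALLED_APPS = [
            "django.contrib.sites",
            # ... shared apps ...
        ]
        # shared DATABASES, caches, API credentials, etc. go here

        # settings_site_a.py -- one thin file per site
        from settings_base import *   # reuse ~90% of the configuration
        SITE_ID = 1                   # matches this site's row in the django_site table
        ROOT_URLCONF = "site_a.urls"
        TEMPLATE_DIRS = ("templates/site_a",)

    Each settings module is then pointed at by its own WSGI process (DJANGO_SETTINGS_MODULE=settings_site_a), which is exactly the "one worker per site" trade-off mentioned in the question.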

  • How does one set up an API key on a locally hosted server...

    - by L33tCh
    I am setting up a personal WordPress site and want to be able to post to it from other sites; the common request is for my "API Key". When you create a site on WordPress.com, for example, an API key is sent to you by mail, but surely it should be relatively simple (if it isn't just an address on the local site to point to) to have one for a personal server (Ubuntu Server)?

  • In Chrome, is there a way to search only within my bookmarked sites?

    - by McGirl
    I'm not trying to search for a bookmark - I'm trying to search for a term that may appear on one of my bookmarked sites. For instance, if I wanted to search on the term "Obama" and restrict my query to the New York Times web site, I know I could type this into the main URL/Search bar in Chrome:

        obama site:nytimes.com

    I'm wondering if there's a way to change that so that it's effectively:

        obama site:any one of the sites in my Chrome bookmark folder

    Thanks in advance for your help!
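
    For reference, one workaround (not Chrome-specific, and assuming Google as the search engine) is to OR several site: operators together in a single query, or to build a Google Custom Search Engine restricted to a hand-picked list of sites:

        obama (site:nytimes.com OR site:washingtonpost.com OR site:bbc.co.uk)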

  • Seizing naming master from child domain server

    - by meera
    When I try to seize the naming master role from my child domain server, I get the following error:

        fsmo maintenance: seize naming master
        Attempting safe transfer of domain naming FSMO before seizure.
        ldap_modify_sW error 0x34(52 (Unavailable).
        Ldap extended error message is 000020AF: SvcErr: DSID-03210380, problem 5002 (UNAVAILABLE), data 8438
        Win32 error returned is 0x20af(The requested FSMO operation failed. The current FSMO holder could not be contacted.)
        )
        Depending on the error code this may indicate a connection, ldap, or role transfer error.
        Transfer of domain naming FSMO failed, proceeding with seizure ...
        Server "win-fb20ixk90mu" knows about 5 roles
        Schema - CN=NTDS Settings,CN=WIN-3918XHC5STU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        Naming Master - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        PDC - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        RID - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        Infrastructure - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
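
    For context, the usual seizure sequence in ntdsutil looks like the sketch below (run on the DC that should take over the role; the server name here is taken from the output above and may need adjusting):

        C:\> ntdsutil
        ntdsutil: roles
        fsmo maintenance: connections
        server connections: connect to server WIN-FB20IXK90MU
        server connections: quit
        fsmo maintenance: seize naming master

    The "Unavailable" error during the safe-transfer attempt is expected when the current role holder really cannot be reached; ntdsutil should then carry on with the seizure, as the "proceeding with seizure ..." line suggests.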

  • Title of the page in search results and title of Google's cached version are different. Why?

    - by Azmorf
    Check this: http://www.google.com/search?q=site:gunlawsbystate.com+kansas+gun+laws The title of the first result is "Kansas Gun Laws - Gun Laws By State". However, on the page Google has cached, the title is different: <title>Kansas Gun Laws - Kansas Gun Law - Reciprocity Guide</title> Google shows the title that was on the site 2-3 months ago. Googlebot has visited the website many times since then, and as you can see it has even cached it (the latest version is from 15th Sept), yet for some reason it doesn't change the title to the new one in the search results. We use a hash-bang URL structure on this website. It completely meets Google's requirements for AJAX websites (the _escaped_fragment_ stuff). The issue I explained is happening with almost all hash-bang pages that got indexed. Questions: Why does it keep the old page title in the search results? Can it be connected to the fact that I'm using hash-bang URLs? There are lots of pages on the site that have the same issue, and all of them have hash-bang URLs. Another thing I noticed is that Google's "Preview" feature doesn't work for any hash-bang URLs on the site. Did I do anything wrong? It has cached versions of the pages, so why wouldn't it generate a preview? Thanks (and sorry for my English). PS: Here's a weird thing I also noticed: this search query https://www.google.com/search?q=Kansas+Gun+Laws+-+Reciprocity+Guide shows the correct title for the same page as in the example above. Why does Google show different titles for the same page for different queries?

  • Website not reachable via mobile device

    - by sanders
    I have a very strange problem with a website. The site works normally when visited from a non-mobile web browser, but when I try to reach it from a mobile device (I tried different browsers) it doesn't load. The only error I see is "This website is not available" and the message is DNS_PROBE_POSSIBLE. Does anyone know where I should look for the problem? The website is based on Joomla. The URL of the site is: http://bit.ly/1bOToII
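
    A quick way to confirm whether this is a DNS problem (which the DNS_PROBE_* message suggests) is to compare what the mobile carrier's resolver returns with what a public resolver returns. A sketch, with example.com standing in for the real hostname behind the shortened link:

        nslookup www.example.com            # using the default resolver of the network you are on
        nslookup www.example.com 8.8.8.8    # asking Google's public resolver directly

    If the first lookup fails (or returns a different address) only on the mobile network, the problem is in DNS delegation or propagation rather than in Joomla itself.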

  • How do I prevent spawning of zombie-like apache2 processes on Dreamhost VPS?

    - by Jonathan Hayward
    I have had a website for months or longer on a DreamHost VPS, and I have vague memories of, during initial setup, having to turn off some customized Apache under /dh to get a standard Apache 2.x to work with. Things had been going along on an even keel until I started making some changes lately, and I found that when I tried to bounce Apache (/usr/sbin/apachectl restart), it couldn't bind to port 80, and my site had been converted from a big literature site to a small parking site. I tried to see what was listening on port 80, and it was a DreamHost-customized Apache that had spawned. I killed all of them, restarted Apache, and changed the parent directory under /dh to mode 000. That was a day or two ago. I was bouncing Apache again while trying to get a new site to load under HTTPS, and I found that once again DreamHost's Apache had spawned, from the directory I set to mode 000, and once again converted my site to a parking page. I renamed the directory, but I am very skeptical of whether I have permanently killed the DreamHost-customized Apache. Besides duct-tape options like a crontab entry to kill and delete it each minute, how can I permanently turn off the Apache processes that are spawning from a location under /dh and interfering with standard Apache? What should I be doing that I am not? Can DreamHost's technical support stop the interference? Thanks,
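
    For diagnosis, it helps to identify exactly which binary is holding port 80 and what keeps respawning it (on a managed VPS this is typically a watchdog or supervisor process rather than the Apache child itself). A sketch of the usual checks:

        sudo netstat -tlnp | grep ':80 '     # or: sudo lsof -i :80  -- shows PID and full path of the listener
        ps -ef | grep -i apache              # look for a parent process that could be respawning it
        sudo kill <pid-of-the-dreamhost-apache>
        sudo /usr/sbin/apachectl restart

    If a supervisor keeps restarting the /dh copy, killing the child each time will never stick; the supervisor (or DreamHost's managed-Apache setting in their panel) is what has to be disabled, which is likely a question for their support.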

  • Making internal website available publicly (Win 2008 Server)

    - by endigo
    I have an IIS 7 web site running on a Windows 2008 Server (64-bit) VMware guest, hosted on a Windows 2008 Server (64-bit) host on my local network. My router is a Firebox XEdge, and it has port 80 directed to the IP of the VMware guest. I can reach the web site from inside the network, but I cannot reach it from outside the network. I have other web sites that are working through the Firebox, and I am confident that it is configured correctly. I suspect that Windows 2008 Server is blocking routed or public addresses, but I have shut down the firewall on the guest and the AVG anti-virus, to no avail. How can I make my site available publicly?
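
    Two quick checks worth running on the guest (a sketch; the port and paths are defaults assumed from the question) to confirm IIS is actually listening on the address the Firebox forwards to:

        %windir%\system32\inetsrv\appcmd list sites
        netstat -ano | findstr :80

    The first shows each site's bindings (a binding tied to a specific host header or IP will reject requests that arrive under the public address); the second confirms something is listening on port 80 and on which local IP. It is also worth confirming that the guest's virtual network adapter is bridged rather than NAT - with NAT networking, the Firebox's port forward points at an IP that outside traffic can never actually reach.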

  • Search engine (Bing) lost my www

    - by Jason
    I just found that my site's listing in Bing's results is broken, because Bing displays the wrong domain name: my site's domain name is www.mysite.com, but Bing lists it as mysite.com. How can I ask Bing to change it to the right one? Other search engines list it correctly.
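
    One common way to signal the preferred host name to search engines (assuming the site runs on Apache, and with mysite.com standing in for the real domain as in the question) is a site-wide 301 redirect from the bare domain to the www form, e.g. in .htaccess:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

    Bing also lets site owners register both host variants in its Webmaster Tools and will generally consolidate on the one that redirects permanently.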

  • Google results show .info domain instead of .com

    - by user481913
    I am currently on shared hosting, and I registered this account with a .info domain as the main domain, say MyDomain.info. However, the site runs from MyDomain.com. This is a cPanel-based shared hosting account. MyDomain.info has nothing hosted at all, i.e. no content files. MyDomain.com is set up as an add-on domain and runs from /public_html/MyDomain under MyDomain.info. The problem is that when I type MyDomain as the search keyword in Google, it shows result(s) for MyDomain.info, although this is not the intended site and has no content hosted on it. I tried to solve the issue by issuing a 301 permanent redirect from MyDomain.info to MyDomain.com; however, Google keeps displaying MyDomain.info as the main site in the results even 1 month after the redirect. I want Google to index MyDomain.com as the main site and remove MyDomain.info from the results. Also, is this harmful from an SEO point of view? How can I improve the SEO if it is?
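
    Before anything else, it is worth verifying that the 301 is really what Google sees (add-on domain setups sometimes answer with a 200 for the parked domain instead). A quick check from any shell, with MyDomain standing in for the real names as in the question:

        curl -I http://MyDomain.info/
        # expected:
        #   HTTP/1.1 301 Moved Permanently
        #   Location: http://MyDomain.com/

    If the redirect is in place, adding rel="canonical" links on the .com pages and verifying both domains in Google Webmaster Tools usually speeds up the consolidation; a correct 301 from the unused domain is not harmful for SEO.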

  • How to serve 410 from Apache rather than a 404

    - by DanSingerman
    On our site, we tend to remove pages a lot where the content has expired, and we want to return HTTP status 410 rather than 404 for requests to pages (physical files) that don't exist on our server (the entire site is made up of static files). We have tried the approach from http://diveintomark.org/archives/2003/03/27/http_error_410_gone but that just breaks our entire site, serving a 410 for every request. We are using Apache 2.2.3.
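
    A minimal sketch of one way to do this with mod_rewrite on Apache 2.2 - the [G] flag sends 410 Gone - while leaving requests for files that do exist untouched (place it in the site's config or .htaccess):

        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule .* - [G]

    The two RewriteCond lines are what stop the rule from applying to every request; without them (or with an approach that answers 410 unconditionally) the whole site gets the 410 treatment, which sounds like what happened here.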

  • http to https upgrade -- SEO troubles

    - by SLIM
    I upgraded my site so that all pages have gone from using http to https. I didn't consider that Google treats https pages differently from http. I re-created my sitemap so that all links now reflect the new https URLs and let it be for a few days. (Whoops!) Google is now re-indexing all the https pages. I have about 19k pages on the site, and Google has already indexed about 8k of the new https ones. The problem is that Google sees all of these as brand-new pages, when many of them have a long http history. Of course most of you will recognize the problem: I didn't set up a 301 from the old http to the new https. Is it too late to do this? Should I switch my sitemap back to http and then 301 to the new https? Or should I leave the sitemap as is and set up 301 redirects anyway? I'm not even sure if Google is trying to reach the http site anymore. Currently the site is doing 303 redirects (from http to https), although I haven't figured out why yet. Thanks for any suggestions you can offer.
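
    Setting up the redirects late is still worthwhile: the usual fix is a blanket permanent redirect so every old http URL passes its history to its https twin. If the site runs on Apache, a minimal .htaccess sketch (which would also replace whatever is currently issuing the 303s) looks like:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    The sitemap can stay on https; the 301s are what tell Google the http pages have permanently moved rather than disappeared.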

  • Making an advertising server to serve ads from different ad networks

    - by John
    In India there are many ad networks (other than AdSense) that pay per acquisition or per lead, so JavaScript ad code is not required (fraudulent clicks don't matter as long as the lead converts). An ad network will have many companies, and each company will have many banner sizes for its ads. Also, any ad may suddenly be stopped just because the company's target has been met, which is a common nuisance: if we don't remove those URLs, that company gets conversions for free. I have a dozen sites, and removing the ads from them every now and then is difficult. Also, CPA-based ads may not convert at all, which means I'll need to remove non-performing ads regularly. I've gone through "How can I show multiple ad networks on my site?". I've also looked at DFP, but without AdSense they wouldn't let me open an account. I want to make an ad server into which I'll feed new ads (banner image + click-through link), maintaining categories like shoes, phones, books, etc. If an ad is paused, I'll simply remove/pause the ad there while other ads in the category keep running; changing ad code within the sites will no longer be required. For example, I might have an ad category "clothing" where I can add ads from different companies. If one of my sites requests an ad from there, the server will randomly select an ad in this category and return it to the site for display. Removing/adding ads within this category will not affect the sites requesting those ads. Any idea how to implement it?
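
    The core of such an ad server is small. A minimal Python sketch of the selection logic described above (the in-memory ADS dictionary is a stand-in for whatever database the real server would use):

        import random

        # hypothetical ad store: category -> list of ads
        ADS = {
            "clothing": [
                {"id": 1, "img": "/banners/acme-shirts.png", "url": "http://example.com/acme", "active": True},
                {"id": 2, "img": "/banners/zeta-jeans.png",  "url": "http://example.com/zeta", "active": False},
            ],
        }

        def pick_ad(category):
            """Return a random active ad for the category, or None if every ad is paused."""
            candidates = [ad for ad in ADS.get(category, []) if ad["active"]]
            return random.choice(candidates) if candidates else None

    A site would call something like pick_ad("clothing") through a small HTTP endpoint and render the returned banner image and click-through link; pausing an ad is then just flipping its active flag on the server, with no change on any of the dozen sites.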

  • Adventures in Drupal multisite config with mod_rewrite and clean urls

    - by moexu
    The university where I work is planning to offer Drupal hosting to staff/faculty who want a Drupal site. We've set up Drupal multisite with clean URLs, and it's mostly working except for some weird redirects: if you have two sites where one is a substring of the other, then you'll randomly be redirected to the other site. I tracked the problem to how mod_rewrite does path matching. With a config file like this:

        RewriteCond %{REQUEST_URI} ^/drupal
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /drupal/index.php?q=$1 [last,qsappend]

        RewriteCond %{REQUEST_URI} ^/drupaltest
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /drupaltest/index.php?q=$1 [last,qsappend]

    /drupaltest will match the /drupal line, and all of the links on the /drupaltest page will be rewritten to point to /drupal. If you put the end-of-string character ($) at the end of each rewrite condition, then it will always match the correct site and the links will always be rewritten correctly. That breaks down as soon as a user logs in, though, because the query string is appended to the URL, so just the base URL will no longer match. You can also fix the problem by ordering the sites in the config file so that the smallest substring is always last. I suggested storing all of the sites in a table and then querying, sorting, and rewriting the config file every time a Drupal site is requested, so that we could guarantee the order. The system administrator thought that was kludgy and didn't address the root problem. Disabling clean URLs should also fix the problem, but the users really want them, so I'd prefer to keep them if possible. I think we could also fix it by using an .htaccess file in each site to handle the clean URL rewriting, but that also seems suboptimal, since it will generate a higher load on the server and the server is intended to host the majority of the university's external-facing web content. Is there some magic I can do with mod_rewrite to get it to work? Would another solution be better? Am I doing something the wrong way to begin with?
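
    One mod_rewrite-only fix (a sketch based on the rules above) is to anchor each condition on "end of path or a slash" instead of a bare prefix or a full-string match. /drupaltest then never satisfies the /drupal condition, while /drupal/node/1 and logged-in requests under /drupal still do, regardless of the order of the blocks:

        RewriteCond %{REQUEST_URI} ^/drupaltest($|/)
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /drupaltest/index.php?q=$1 [last,qsappend]

        RewriteCond %{REQUEST_URI} ^/drupal($|/)
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /drupal/index.php?q=$1 [last,qsappend]

    The ($|/) alternation is the only change from the original config; it makes the prefix match a whole path segment, so neither rule ordering nor per-site .htaccess files are needed for this particular problem.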

  • Advantage of Software Development [on hold]

    - by user93319
    The worth of the brand a corporation carries is far greater than its physical presence. If you are venturing into an online business, you need to take special care of the corporate image of your business. Nowadays it is very important for every organization to have its own website, and to enhance a company's online presence it is important to have a good website design as well as a blueprint of that design. A quality site can support organizational growth and lend a hand in achieving the company's goals promptly. Since websites have gained so much importance, it is advisable for an organization to seek professional support for the construction of its own portals. Expert help may again be essential when one needs a complete renovation on the web. Any group should do a bit of introspection before it decides to look for the services of a professional web software development company. It is good to be completely clear-headed regarding one's requirements. To start with, a business should know who its potential clientele are. Once this main factor has been given consideration, an organization can go ahead and get its website designed accordingly. By approaching a company that offers software development services in India to its customers, a client can make sure that their site is ready with all the most up-to-date features. Professional assistance matters: a professional service supplier is known to furnish a site with easy-to-use features that prompt visitors to come back again and again. Yet one more benefit of receiving aid from professionals is that they can advise you on the type of content that you should place on display on your site. For example, a business that wants to draw the interest of professionals belonging to the corporate world must make sure that the language used on its website is crisp and official.

  • VMware vSphere 4.1: host performance graphs show "No data available", except the realtime view, which works fine

    - by Graeme Donaldson
    Here's our scenario: Site 1 has 3 hosts, and our vCenter server is here. Site 2 has 3 hosts. All hosts are ESXi 4.1 Update 1. If I view the Performance tab for any host in Site 1, I can view realtime, 1 Day, etc. - all the views give me graph data. For the hosts in Site 2, I can view the realtime graphs, but 1 Day and 1 Week both say "No data available". The 1 Month view had mostly nothing, and the 1 Year view shows that it was working fine for a long time and then started breaking. (Screenshots of the 1 Month and 1 Year views are not reproduced here.) What would cause this loss of performance data?

  • Wiki Application With A Reputation System

    - by Christofian
    I'm really impressed with Stack Exchange's concept of reputation (you gain reputation as you post, and the more you post, the more privileges you get), and I want to apply the concept to a wiki that I am building. Does anyone know of a PHP wiki that has a concept of privileges/reputation similar to Stack Exchange? I'm not necessarily looking for something identical to SE; I'm just looking for a wiki application that gives users more privileges the more they contribute positively to the wiki (SE has down votes, so the wiki should have some way of identifying negative contributions too). The privileges should be category-based, so the more active you are in a specific category or page, the more privileges you get for that category. There should also be site-wide privileges, though those should be harder to earn than the category privileges. NOTE: If it is not possible to have both category-wide privileges and site-wide privileges, I will be OK with just one or the other. I should be able to change the requirements for each privilege, through an administration panel or by editing a file (some wiki applications don't have administration interfaces). Does anyone have a script or a solution that will do this? If the script uses something similar to reputation to determine how much a user has positively contributed to the site, then that is OK too. Please note: I am looking for a way to rate individual user contributions, not a way to rate the quality of an entire page.

  • Apache ProxyPass/ProxyPassReverse to IIS

    - by Dana
    We have an ASP.NET web application which is mapped to a folder on an Apache-hosted PHP site using ProxyPass/ProxyPassReverse. A couple of problems are being encountered: cookies are being lost, which breaks the site navigation; this can be overcome by setting the ASP app as cookieless. Forms authentication is used on the ASP site; this is also broken with the ProxyPass in place, and I suspect this is cookie-related too. The ASP site works OK when run directly from a domain/IP address. Use of a separate domain/sub-domain is not an option due to client requirements.
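
    When cookies set by a proxied back end get lost, the usual culprits are the cookies' Domain and Path attributes still pointing at the internal host or root path. mod_proxy has directives specifically for rewriting them; a sketch (the internal host name and the /aspapp mount point are hypothetical stand-ins):

        ProxyPass        /aspapp http://iis-internal.example.local/
        ProxyPassReverse /aspapp http://iis-internal.example.local/
        ProxyPassReverseCookieDomain iis-internal.example.local www.example.com
        ProxyPassReverseCookiePath   / /aspapp

    With the cookie domain and path rewritten to match what the browser actually sees, the ASP.NET session and Forms authentication cookies should survive the proxy without resorting to cookieless mode.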

  • Mixed sessions with Classic ASP on IIS 7.5 and Windows 2008 R2 64 bit

    - by Marcin
    We recently had an issue with a server upgrade from IIS 6 on Windows 2003 to IIS 7.5 on Windows 2008 R2 64-bit. We have a number of websites running on Classic ASP. All the sites sit under a particular site, e.g. www.example.com/foo and www.example.com/foobar. On IIS 6 each site was set up as a virtual directory and things worked fine. Since moving to the new setup, a lot of websites seem to have mixed sessions. To be clear, this is not an app pool recycling issue; rather, the sessions are populated with information when the user hits the site, and while browsing they get sessions from different sites. We've determined this based on two things: a few customers called up and reported that their shopping cart contained items whose names belonged to a different site, and our own testing showed that some queries being run would try to bring products in from a different site. We've tried disabling dynamic caching, converting each site to a virtual application (if I understand correctly, the virtual directory/application concepts were changed/refined somewhat in IIS 7, although to be honest I'm not clear what the difference is), and various application pool changes (using the .NET 2 framework, classic and integrated modes, changing the process model to NetworkIdentity), all to no avail. The only thing we haven't tried is changing it to run as a 32-bit application. We're not using HTTP-only cookies, and when I open up a browser and type document.cookie into the dev console in Firefox/Chrome/IE, I see that there are multiple ASPSESSIONID=... values, whereas previously I believe there was only one. Finally, we use server-side JScript for the Classic ASP pages, not VBScript, so we have code similar to the below:

        //the user's login account as a jscript object
        Session("user") = { email : "[email protected]", id : 123 };

    And if we execute a line of code like this:

        Response.Write( typeof(Session("user")) );

    When things are running correctly, we get "object", as expected. When the session gets trashed, the output is "unknown" and we are also unable to access the fields within the JScript object (e.g. the .email or .id fields). Much appreciated if anyone can provide any pointers about how to resolve this; everything on Google seems to point to different issues.

  • Unextending Sharepoint 2007 Web Application from a zone

    - by dunxd
    When our SharePoint was migrated from SharePoint 2003 to SharePoint 2007 (both fully paid versions), the consultants who carried it out extended each web app into two IIS sites/zones (e.g. for the original web app http://intranet, both http://newintranet and http://intranet would be created for SharePoint 2007, each with its own IIS site). The idea was that during the migration period we would set up DNS to point the old URL to the SP2003 servers and the new one to SP2007, then once the migration was complete, do a DNS change so that SP2007 would receive the requests to the http://intranet type URLs. Unfortunately the contractors did not tidy up the application extensions and IIS sites after the migration, and for some time both URLs were in use, resulting in many document links pointing to the http://newintranet type URLs. This means I need to maintain these URLs. Due to a rejig of organisation structure we now need to relocate some SharePoint sites, and I'd like to use the RDA Collaboration SharePoint URL Redirector feature. However, a limitation of this is that it doesn't work for web applications which have been extended into multiple zones. So I have a need to tidy up the situation that our consultants left behind. I think the right thing to do is use the "Remove SharePoint from IIS Web Site" page in Central Admin to remove the zone for the newintranet type sites, and select the option to also delete the IIS site. That should result in having no IIS sites listening for http://newintranet type URLs. Is this the right procedure? Once I have done that, I need to set up SharePoint to receive requests sent to the http://newintranet type URLs so they will continue to work. I am not sure whether I should do this by using Alternate Access Mappings, by adding a host header to the IIS site, or by creating a non-SharePoint IIS site for each http://newintranet type URL and using IIS redirection to forward the requests to the new URL, with variables to pass the path to the SharePoint site. Does anyone have any thoughts on these options, or any other way of achieving this? SharePoint 2007 is running on Windows 2003 with IIS 6. We don't currently have plans/budget to upgrade to SharePoint 2010.

  • Can Windows Essentials Remove the alescruf.c Virus from a Windows PC?

    - by jmort253
    A colleague just visited a site which had been hacked with the alescruf.c trojan, which infects HTML via embedded JavaScript; Windows PCs are infected by visiting the site. If my colleague has Windows Essentials, which detected and removed the trojan, is there anything else he needs to do to make sure the virus is no longer active? Windows Essentials detected the trojan within seconds of visiting the site, and he's running a full system scan to see if it still detects anything.

  • FTP restrict user access to a specific folder

    - by Mahdi Ghiasi
    I have created an FTP site in the IIS 7.5 panel, and I now have access to the whole site using the administrator username and password. I want to let my friend access a specific folder of that FTP site (for example, this path: \some\folder\accessible\). I can't create a whole new FTP site for this purpose, since it says the port is being used by another website. How do I create an account for my friend that has access to just a specific folder? P.S.: I have read about the User Isolation feature of IIS 7.5, but I couldn't find out how to create a user just for FTP and point it at a custom path.
