Search Results

Search found 8020 results on 321 pages for 'webcenter sites'.

Page 34/321 | < Previous Page | 30 31 32 33 34 35 36 37 38 39 40 41  | Next Page >

  • Where are the world's most popular websites located? A report reveals they are concentrated in the United States, with France closing out the top 5

    Where are the world's most popular websites located? A report reveals they are concentrated in the United States, with France closing out the top 5. A recent report from the firm Host Cabi, produced with a free tool that geolocates the most popular websites (as ranked by Alexa), reveals where those sites are hosted. The report, titled "Where are the world's 100,000 most popular websites hosted?", shows that they are spread across 1,204 hosting providers in 621 cities...

    Read the article

  • django cross-site reverse a url

    - by tutuca
    I have a similar question to django cross-site reverse, but I don't think I can apply the same solution. I'm creating an app that lets users create their own site. After completing the signup form, the user should be redirected to his site's new post form, something along these lines:

        new_post_url = 'http://%s.domain:9292/manage/new_post' % site.domain
        logged_user = authenticate(username=user.username, password=user.password)
        if logged_user is not None:
            login(request, logged_user)
            return redirect(new_post_url)

    Now, I know that "new_post_url" is awful and makes babies cry, so I need to reverse it in some way. I thought of using django.core.urlresolvers.reverse to solve this, but that only returns URLs on my own domain, not on the user's newly created site, so it doesn't work for me. So, do you know a better/smarter way to solve this?
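
    One workaround, sketched here rather than taken from any official Django cross-site API: use reverse() only for the path, which is all it knows about, and prepend the target site's host yourself. The view name 'manage_new_post' and the helper are illustrative:

        from django.core.urlresolvers import reverse

        def site_url(site, viewname, *args, **kwargs):
            # reverse() resolves only the path portion of a URL; Django's
            # resolver has no notion of other sites' hosts, so we add the
            # user's subdomain (as in the question's hard-coded URL) ourselves.
            path = reverse(viewname, args=args, kwargs=kwargs)
            return 'http://%s.domain:9292%s' % (site.domain, path)

        # usage in the signup view:
        #   return redirect(site_url(site, 'manage_new_post'))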

    Read the article

  • Can we run a Windows service or EXE in an Azure Website or in a Virtual Machine?

    - by Arun Rana
    I have experience with cloud services/hosted services on Azure. However, for another project I am confused about which option to choose in terms of functionality. I have a project (a 2-tier ASP.NET app) with which I need to run a Windows service or EXE that will do some work every day (like fetching data), so my questions are as follows.

    Regarding Azure Websites: Can I access RDP if I move to a reserved instance? Can I run a Windows service/EXE?

    Regarding Virtual Machines: Is it the same as a dedicated server? Can I use WASD (Windows Azure SQL Database) as the database from an application residing on the same machine? I think I can run any EXE and install anything; however, Azure is going to recycle the instance, and if so, what happens on recycling? Can I use the new Windows Server 2012 (VHD) in it?

    Azure Websites and Virtual Machines are both in preview mode, so is it reliable to use them in production?

    Read the article

  • How to set up custom DNS with Azure Websites Preview?

    - by husainnz
    I created a new Azure Website, using Umbraco as the CMS. I got a page up and going, and I already have a .co.nz domain with www.domains4less.com. There's a whole lot of stuff on the internet about pointing URLs to Azure, but that seems to be more of a redirection service than anything (i.e. my URLs still use azurewebsites.net once I land on my site). Has anybody had any luck getting it to go? Here's the error I get when I try adding the DNS entry to Azure (I'm in reserved mode; reemdairy is the name of the website):

        There was an error processing your request. Please try again in a few moments.
        Browser: 5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.56 Safari/536.5
        User language: undefined
        Portal Version: 6.0.6002.18488 (rd_auxportal_stable.120609-0259)
        Subscriptions: 3aabe358-d178-4790-a97b-ffba902b2851
        User email address: [email protected]

    The first entry in the "Last 10 Requests" log is the failure:

        message: Failure: Ajax call to: Websites/UpdateConfig. failed with status: error (500) in 2.57 seconds.
        x-ms-client-request-id: 38834edf-c9f3-46bb-a1f7-b2839c692bcf-2012-06-12 22:25:14Z
        dateTime: Wed Jun 13 2012 10:25:17 GMT+1200 (New Zealand Standard Time)
        durationSeconds: 2.57
        url: Websites/UpdateConfig
        status: 500
        textStatus: error
        sessionId: 09c72263-6ce7-422b-84d7-4c21acded759
        referrer: https://manage.windowsazure.com/#Workspaces/WebsiteExtension/Website/reemdairy/configure
        host: manage.windowsazure.com
        response: {"message":"Try again. Contact support if the problem persists.","ErrorMessage":"Try again. Contact support if the problem persists.","httpStatusCode":"InternalServerError","operationTrackingId":"","stackTrace":null}

    The remaining entries are all successful (200) Ajax calls, each completing in under two seconds: Websites/GetConfig, /Service/OperationTracking, /Service/GetUserSettings, and the SqlAzure extension endpoints (ClusterSuffix, GetClientIp, DefaultServerLocation, ServerLocations).
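
    For reference, mapping a custom domain to an Azure Websites preview site in reserved mode involved two DNS records: a CNAME that Azure used to verify domain ownership and the record that actually routed traffic. A sketch using the names from the question; the exact awverify host names should be checked against the portal's "Manage Domains" instructions before relying on them:

        ; hypothetical zone entries for reemdairy.co.nz
        awverify.www.reemdairy.co.nz.  IN  CNAME  awverify.reemdairy.azurewebsites.net.
        www.reemdairy.co.nz.           IN  CNAME  reemdairy.azurewebsites.net.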

    Read the article

  • How to find the top links in a website?

    - by Anil
    I want to know what the best links in a site are, ranked either by PageRank or by popularity. For example, http://www.pragprog.com is a site; I want to find out the most relevant links this site has. The links should not be external outbound links; they should belong to the same site. Can Google or any similar service provide such information?
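
    Google doesn't expose an internal-link ranking directly (Webmaster Tools has an "Internal links" report that comes close), but you can approximate popularity yourself by counting how often each internal URL is linked. A minimal single-page sketch in Python, using the domain from the question; extending it into a breadth-first crawl of the whole site is straightforward:

        import urllib.request
        from collections import Counter
        from html.parser import HTMLParser
        from urllib.parse import urljoin, urlparse

        class LinkParser(HTMLParser):
            """Collects href values from anchor tags."""
            def __init__(self):
                super().__init__()
                self.links = []

            def handle_starttag(self, tag, attrs):
                if tag == 'a':
                    for name, value in attrs:
                        if name == 'href' and value:
                            self.links.append(value)

        def internal_links(page_url, site_netloc):
            # Fetch one page and keep only links that stay on the same host
            html = urllib.request.urlopen(page_url, timeout=10).read().decode('utf-8', 'replace')
            parser = LinkParser()
            parser.feed(html)
            absolute = (urljoin(page_url, href) for href in parser.links)
            return [u for u in absolute if urlparse(u).netloc == site_netloc]

        start = 'http://www.pragprog.com/'
        counts = Counter(internal_links(start, urlparse(start).netloc))
        for url, n in counts.most_common(10):
            print(n, url)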

    Read the article

  • What happens when you add/remove the current site as a trusted site?

    - by kasey
    What happens when you add or remove the current site as a trusted site while logged on? When users do this on our website and then try to click a link or close the browser, they get the following JavaScript exception: "Microsoft JScript runtime error: 'type' is null or not an object" in the library code below, at the line var etype = this.type = e.type.toLowerCase();

        Sys.UI.DomEvent = function Sys$UI$DomEvent(eventObject) {
            /// <summary locid="M:J#Sys.UI.DomEvent.#ctor" />
            /// <param name="eventObject"></param>
            /// <field name="altKey" type="Boolean" locid="F:J#Sys.UI.DomEvent.altKey"></field>
            /// <field name="button" type="Sys.UI.MouseButton" locid="F:J#Sys.UI.DomEvent.button"></field>
            /// <field name="charCode" type="Number" integer="true" locid="F:J#Sys.UI.DomEvent.charCode"></field>
            /// <field name="clientX" type="Number" integer="true" locid="F:J#Sys.UI.DomEvent.clientX"></field>
            /// <field name="clientY" type="Number" integer="true" locid="F:J#Sys.UI.DomEvent.clientY"></field>
            /// <field name="ctrlKey" type="Boolean" locid="F:J#Sys.UI.DomEvent.ctrlKey"></field>
            /// <field name="keyCode" type="Number" integer="true" locid="F:J#Sys.UI.DomEvent.keyCode"></field>
            /// <field name="offsetX" type="Number" integer="true" locid="F:J#Sys.UI.DomEvent.offsetX"></field>
            /// <field name="offsetY" type="Number" integer="true" locid="F:J#Sys.UI.DomEvent.offsetY"></field>
            /// <field name="screenX" type="Number" integer="true" locid="F:J#Sys.UI.DomEvent.screenX"></field>
            /// <field name="screenY" type="Number" integer="true" locid="F:J#Sys.UI.DomEvent.screenY"></field>
            /// <field name="shiftKey" type="Boolean" locid="F:J#Sys.UI.DomEvent.shiftKey"></field>
            /// <field name="target" locid="F:J#Sys.UI.DomEvent.target"></field>
            /// <field name="type" type="String" locid="F:J#Sys.UI.DomEvent.type"></field>
            var e = Function._validateParams(arguments, [ {name: "eventObject"} ]);
            if (e) throw e;
            var e = eventObject;
            var etype = this.type = e.type.toLowerCase();
            this.rawEvent = e;
            this.altKey = e.altKey;
            if (typeof(e.button) !== 'undefined') {
                this.button = (typeof(e.which) !== 'undefined') ? e.button :
                    (e.button === 4) ? Sys.UI.MouseButton.middleButton :
                    (e.button === 2) ? Sys.UI.MouseButton.rightButton :
                    Sys.UI.MouseButton.leftButton;
            }
            if (etype === 'keypress') {
                this.charCode = e.charCode || e.keyCode;
            }
            else if (e.keyCode && (e.keyCode === 46)) {
                this.keyCode = 127;
            }
            else {
                this.keyCode = e.keyCode;
            }
            this.clientX = e.clientX;
            this.clientY = e.clientY;
            this.ctrlKey = e.ctrlKey;
            this.target = e.target ? e.target : e.srcElement;
            if (!etype.startsWith('key')) {
                if ((typeof(e.offsetX) !== 'undefined') && (typeof(e.offsetY) !== 'undefined')) {
                    this.offsetX = e.offsetX;
                    this.offsetY = e.offsetY;
                }
                else if (this.target && (this.target.nodeType !== 3) && (typeof(e.clientX) === 'number')) {
                    var loc = Sys.UI.DomElement.getLocation(this.target);
                    var w = Sys.UI.DomElement._getWindow(this.target);
                    this.offsetX = (w.pageXOffset || 0) + e.clientX - loc.x;
                    this.offsetY = (w.pageYOffset || 0) + e.clientY - loc.y;
                }
            }
            this.screenX = e.screenX;
            this.screenY = e.screenY;
            this.shiftKey = e.shiftKey;
        }

    Note: the site does not require trusted privileges to function correctly.

    Read the article

  • Managing multiple blogs (WordPress) from one control panel

    - by thisMayhem
    This is the deal: I have more than one blog (not running yet) that I want to install in subfolders of one of my domains. Each blog will obviously have its own look and feel and content. I could just have a set of control panels and administer them individually, or I could somehow manage them from a single control panel. The issue I've come across is that some plugins offer this functionality while suppressing other features like auto draft saving, in-post category and tag creation, image upload, etc. I'm not necessarily looking for a ready-made solution; I'd rather have some guidance on how to approach this particular problem in the best possible manner. I'm looking for a fully working scenario, not some plugin that allows me to more or less simulate the one-blog administration experience.

    Read the article

  • acts_as_ferret with multiple hosts

    - by Nick
    I've got everything working with ferret and acts_as_ferret for development (or localhost DRb), but I can't get my multiple host deployment working. All of the remote systems get ECONNREFUSED when accessing the port. On the ferret server, the daemon is listening on localhost only despite the configuration listing the FQDN as the host. I also tried switching to a UNIX socket to share data between the ferret DRb daemon and the app code but it too gets ECONNREFUSED. (The socket is available to all of the machines via an NFS mount). Is there a better way to do this or should I be looking for another search indexer? Thanks.
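
    In case it points anyone in the right direction: acts_as_ferret's DRb daemon takes its host and port from config/ferret_server.yml, and a daemon that listens on loopback despite an FQDN in the config often means the FQDN resolves to 127.0.0.1 locally (a common /etc/hosts artifact). A sketch of that file under those assumptions; verify the key names against your acts_as_ferret version:

        # config/ferret_server.yml (sketch)
        production:
          host: ferret.example.com   # must resolve to the machine's routable IP,
                                     # not 127.0.0.1, or the daemon binds to loopback
          port: 9010
          pid_file: log/ferret.pid
          log_file: log/ferret_server.log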

    Read the article

  • Apache renewed SSL cert not working [on hold]

    - by Varun S
    I downloaded a new SSL cert from GoDaddy and installed it on an Apache2 server: I put the cert in the /etc/ssl/certs/ folder, put gd_bundle.crt in the /etc/ssl/ folder, and the private key is in /etc/ssl/private/private.key. I just replaced the original files with the new files; I did not replace the private key. I restarted the server, but the website is still showing the old certificate date. What am I doing wrong and how do I resolve it? My httpd.conf file is empty; the certificate config is in the sites-enabled/default-ssl file. The server is Apache2 running on Ubuntu 14.04.

        SSLEngine on
        # A self-signed (snakeoil) certificate can be created by installing
        # the ssl-cert package. See
        # /usr/share/doc/apache2.2-common/README.Debian.gz for more info.
        # If both key and certificate are stored in the same file, only the
        # SSLCertificateFile directive is needed.
        SSLCertificateFile /etc/ssl/certs/2b1f6d308c2f9b.crt
        SSLCertificateKeyFile /etc/ssl/private/private.key

        # Server Certificate Chain:
        # Point SSLCertificateChainFile at a file containing the
        # concatenation of PEM encoded CA certificates which form the
        # certificate chain for the server certificate. Alternatively
        # the referenced file can be the same as SSLCertificateFile
        # when the CA certificates are directly appended to the server
        # certificate for convinience.
        SSLCertificateChainFile /etc/ssl/gd_bundle.crt

        -rwxr-xr-x 1 root root 1944 Aug 16 06:34 /etc/ssl/certs/2b1f6d308c2f9b.crt
        -rwxr-xr-x 1 root root 3197 Aug 16 06:10 /etc/ssl/gd_bundle.crt
        -rw-r--r-- 1 root root 1679 Oct  3  2013 /etc/ssl/private/private.key

        /etc/apache2/sites-available/default-ssl: # SSLCertificateFile directive is needed.
        /etc/apache2/sites-available/default-ssl: SSLCertificateFile /etc/ssl/certs/2b1f6d308c2f9b.crt
        /etc/apache2/sites-available/default-ssl: SSLCertificateKeyFile /etc/ssl/private/private.key
        /etc/apache2/sites-available/default-ssl: # Point SSLCertificateChainFile at a file containing the
        /etc/apache2/sites-available/default-ssl: # the referenced file can be the same as SSLCertificateFile
        /etc/apache2/sites-available/default-ssl: SSLCertificateChainFile /etc/ssl/gd_bundle.crt
        /etc/apache2/sites-enabled/default-ssl: # SSLCertificateFile directive is needed.
        /etc/apache2/sites-enabled/default-ssl: SSLCertificateFile /etc/ssl/certs/2b1f6d308c2f9b.crt
        /etc/apache2/sites-enabled/default-ssl: SSLCertificateKeyFile /etc/ssl/private/private.key
        /etc/apache2/sites-enabled/default-ssl: # Point SSLCertificateChainFile at a file containing the
        /etc/apache2/sites-enabled/default-ssl: # the referenced file can be the same as SSLCertificateFile
        /etc/apache2/sites-enabled/default-ssl: SSLCertificateChainFile /etc/ssl/gd_bundle.crt
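
    One way to check which certificate Apache is actually serving, sketched with Python's standard ssl module (substitute the real hostname; a verification failure at the handshake is itself a clue that the intermediate bundle is not being sent):

        import socket
        import ssl

        host = 'www.example.com'  # hypothetical placeholder for the real site

        # A default context validates the chain, so if gd_bundle.crt is
        # missing from the active vhost config this raises an SSL error.
        ctx = ssl.create_default_context()
        with socket.create_connection((host, 443), timeout=10) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()

        print('notBefore:', cert.get('notBefore'))
        print('notAfter: ', cert.get('notAfter'))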

    Read the article

  • Can't verify my site on Google (error 403 Forbidden). I have other sites on the same host with no problems whatsoever

    - by Rosamunda Rosamunda
    I can't verify my site on Google. I've done this several times for several sites, all on the same host. I've tried the HTML tag method, HTML file upload, the domain name provider (I can't find the options that Google tells me I should activate...), and Google Analytics. I always get this response:

        Verification failed for http://www.mysite.com/ using the Google Analytics method (1 minute ago). Your verification file returns a status of 403 (Forbidden) instead of 200 (OK).

    I've checked the server headers, and I get this result:

        REQUESTING: http://www.mysite.com
        GET / HTTP/1.1
        Connection: Keep-Alive
        Keep-Alive: 300
        Accept: */*
        Host: www.mysite.com
        Accept-Language: en-us
        Accept-Encoding: gzip, deflate
        User-Agent: Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 6.0)

        SERVER RESPONSE: HTTP/1.1 403 Forbidden
        Date: Wed, 19 Sep 2012 03:25:22 GMT
        Server: Apache/2.2.19 (Unix) mod_ssl/2.2.19 OpenSSL/0.9.8e-fips-rhel5 mod_bwlimited/1.4 PHP/5.2.17
        Connection: close
        Content-Type: text/html; charset=iso-8859-1

        Final Destination Page (it shows my actual homepage).

    What can I do? The hosting is the very same as for my other sites, where I didn't have any issues at all! Thanks for your help! Note: as I have a Drupal 7 site, I tried a "Drupal solution" first, but haven't found any that solved this issue... How can it be forbidden when I can access the link perfectly fine? Is there any solution to this? Thanks!
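
    A frequent cause of exactly this symptom is a server-side filter (mod_security rules or a user-agent blacklist in .htaccess) that returns 403 to Google's fetcher while normal browsers get 200. A quick test, sketched with Python's standard library; the second user-agent string is illustrative rather than Google's exact one:

        import urllib.error
        import urllib.request

        url = 'http://www.mysite.com/'  # placeholder domain from the question
        for ua in ('Mozilla/5.0 (Windows NT 6.1)', 'Google-Site-Verification/1.0'):
            req = urllib.request.Request(url, headers={'User-Agent': ua})
            try:
                status = urllib.request.urlopen(req, timeout=10).getcode()
            except urllib.error.HTTPError as err:
                status = err.code  # urlopen raises on 4xx/5xx; grab the code
            print(ua, '->', status)

    If the second request comes back 403 while the first returns 200, the block is user-agent based rather than anything to do with the verification file itself.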

    Read the article

  • How do I find information on who links to my sites?

    - by bobdobbs
    I'm trying to figure out if there's a free way to get information on backlinks to my site. I've had Webmaster Tools and Google Analytics set up for years, but I can't find access to data about site backlinks in either toolset. Webmaster Tools, under 'Traffic' > 'Links to your site', gives me the same message for all of my sites: "No data available". I haven't been able to find anything in GA that gives any information on backlinks. I've heard of using "links:" as an operator in Google search, but for each of my sites this returns either zero or very few results, in cases where I know I have many backlinks; most of the links simply aren't shown. My thinking is that Google maintains a graph of who links to my site, so I figured they might let me see it, but I can't figure out how. I've found one tool on a spammy website: http://www.backlinkwatch.com. It offers more data on my backlinks than Google, and offers more results in exchange for a paid subscription. The data it offers for free looks good, but the results are limited and the site has popups and obnoxious ads. So, in short: how do I get data on who links to me? Is there a free way?

    Read the article

  • Why does Google not ban these sites using this SEO pattern? [on hold]

    - by saddam.bg
    I have seen some sites using a different kind of SEO to promote copyrighted materials such as movies. They have also submitted their sites to Google Webmaster Tools, yet they still have not been banned, and their Alexa ranks are 7,000 or better. On the other hand, I have run 5 movie affiliate sites, and all of them got banned by Google within a short period of time. I copied the URL of the homepage of solarmovie.me and pasted it into Google search, and instead of the homepage URL I saw that their category or tag page shows as the homepage (www.solarmovie.me/watch-category/hollyw... Now, is solarmovie.me publishing its posts as a single page, or is it something else? I tried to find out what kind of SEO or coding that was, but I couldn't, since I have very little knowledge of coding. I have also seen the same thing with ALLUC.TO in Google search (www.alluc.to/popular-links.html). Could anyone please help with this kind of SEO so that I don't keep getting banned by Google or having my index removed? All SEO webmasters, I need your help!!!! Please give me some good tips for this type of SEO. Thank you very much.

    Read the article

  • Why do SEO-based code tips not appear to affect ranking?

    - by Ben
    I've been researching various methods for SEO where pages have precise titles, keywords are highlighted with h tags, and the many boxes stated in good page markup for SEO are ticked. However, when looking at some top-ranked sites on Google for key terms, they have terrible SEO markup: really long page titles, no tags, limited appearance of keywords in the text, and so on. SEO analysis services rate them lower than other sites, yet these sites rank really high. Even with a low number of backlinks they rank highly, so I don't understand how these sites earn their position when they appear inferior to those below them which have better markup and links. I don't want to cause trouble by mentioning sites or keywords, etc., but looking on Google at 'executive search', it makes no sense why the roughly 5th-placed site should rank so highly, especially with all the added .swfs. The same applies to the top result for 'Japan Executive Search'. My main point is that these sites seem to lack all the important structural rules stated in SEO page-rating applications and general suggested best practice, nor do they show many backlinks. It makes me feel like there is no point bothering to write decent markup if it really doesn't matter. Can anyone explain how sites with such markup and few backlinks can outrank well-written and well-structured sites with greater linkage? Sorry if this is a fuzzy question; I want to avoid singling out any sites as examples, but it really has me perplexed that sites which appear to ignore the suggested best practices rank so well.

    Read the article

  • How can I prevent spam on sites which I control?

    - by danlefree
    This is a general, community wiki question to address all non-specific spam prevention questions. If your question was closed as a duplicate of this question and you feel that the information provided here does not provide a sufficient answer, please open a discussion on Pro Webmasters Meta. For purposes of this question, spam will include:

        Any automated post
        Manually-posted content which includes links to spammers' sites
        Manually-posted content which includes instructions to visit a spammer's site

    Read the article

  • Can anybody recommend C#/XAML Windows Store Development aggregator sites?

    - by Clay Shannon
    I used to have a couple of sites bookmarked that were Windows development article/blog post aggregators, but I can't recall what they were called. What I want to do now is keep up with all relevant C#/XAML "Windows Store" app development info, whether it be blog posts, new "Metro"-specific Channel 9 videos, etc., without spending lots of time surfing about. Can anybody recommend any "C#/XAML Windows Store new information aggregators"?

    Read the article

  • Does WordPress have any SEO benefits over sites coded from scratch? [closed]

    - by Glycan
    Possible Duplicate: Do WordPress websites get indexed quicker by search engines than a regular website? I have heard SEO experts suggest moving sites to WordPress for SEO optimization. Googling offers some perspectives that confirm this (like this, or, somewhat more dubiously, this). Is this accurate? Some other places say that it doesn't matter, except that WordPress configures all the meta tags correctly. In that case, what exactly must be configured?

    Read the article

  • Are there similar sites to jsdo.it and jsfiddle.net? [closed]

    - by TK Kocheran
    I'm looking for sites similar to jsdo.it and jsfiddle.net. I really like jsdo.it, as it provides a nice editor and the ability to fork, but I really can't stand the usability as it really gets to be a pain to use after a while. I also like jsfiddle.net, but it too has some usability issues and the URL for your fiddle changes every time you save. Are there any other good alternatives for in-browser HTML testing and development?

    Read the article

  • Mozilla proposes icons to the W3C to improve data privacy on the Net; sites still need to be convinced to adopt them

    Mozilla proposes icons to the W3C to improve data privacy on the Net; companies still need to be convinced to integrate them into their sites. Icons to alert users to the potential use of their private data: that is what the Mozilla Foundation is preparing, having organized a working group to formulate a proposal to the W3C. Today, these icons move to an alpha version, and Aza Raskin, an expert in human-machine interfaces (behind, among other things, the Tab Candy concept in Firefox...

    Read the article

  • Apache mod_rewrite driving me mad

    - by WishCow
    The scenario: I have a webhost that is shared among multiple sites; the directory layout looks like this:

        siteA/
          css/
          js/
          index.php
        siteB/
          css/
          js/
          index.php
        siteC/
        ...

    The DocumentRoot is at the top level, so to access siteA you type http://webhost/siteA in your browser, to access siteB you type http://webhost/siteB, and so on. Now I have to deploy my own site, which was designed with three VirtualHosts in mind, so my structure looks like this:

        siteD/
          sites/sitename.com/
            log/
            htdocs/
              index.php
          sites/static.sitename.com/
            log/
            htdocs/
              css/
              js/
          sites/admin.sitename.com/
            log/
            htdocs/
              index.php

    As you can see, the problem is that my index.php files are not in the top-level directory, unlike the already existing sites on the webhost. Each VirtualHost should point to the corresponding htdocs/ folder:

        http://siteD.com -> siteD/sites/sitename.com/htdocs
        http://static.siteD.com -> siteD/sites/static.sitename.com/htdocs
        http://admin.siteD.com -> siteD/sites/admin.sitename.com/htdocs

    The problem: I cannot have VirtualHosts on this host, so I have to emulate them somehow, possibly with mod_rewrite.

    The idea: have some predefined parts in all of the links on the site that I can identify and route to the correct file with mod_rewrite. Examples:

        http://webhost/siteD/static/js/something.js -> siteD/sites/static.sitename.com/htdocs/js/something.js
        http://webhost/siteD/static/css/something.css -> siteD/sites/static.sitename.com/htdocs/css/something.css
        http://webhost/siteD/admin/something -> siteD/sites/admin.sitename.com/htdocs/index.php
        http://webhost/siteD/admin/sub/something -> siteD/sites/admin.sitename.com/htdocs/index.php
        http://webhost/siteD/something -> siteD/sites/sitename.com/htdocs/index.php
        http://webhost/siteD/sub/something -> siteD/sites/sitename.com/htdocs/index.php

    Anything that starts with http://url/sitename/admin/(.*) will get rewritten to point to siteD/sites/admin.sitename.com/htdocs/index.php. Anything that starts with http://url/sitename/static/(.*) will get rewritten to point to siteD/sites/static.sitename.com/htdocs/$1. Anything that starts with http://url/sitename/(.*) AND did not already match above will get rewritten to point to siteD/sites/sitename.com/htdocs/index.php.

    The solution: here is the .htaccess file that I've come up with:

        RewriteEngine On
        RewriteBase /

        RewriteCond %{REQUEST_URI} ^/siteD/static/(.*)$ [NC]
        RewriteRule ^siteD/static/(.*)$ siteD/sites/static/htdocs/$1 [L]

        RewriteCond %{REQUEST_URI} ^/siteD/admin/(.*)$ [NC]
        RewriteRule ^siteD/(.*)$ siteD/sites/admin/htdocs/index.php [L,QSA]

    So far, so good. It's all working. Now to add the last rule:

        RewriteCond %{REQUEST_URI} ^/siteD/(.*)$ [NC]
        RewriteRule ^siteD/(.*)$ siteD/sites/public/htdocs/index.php [L,QSA]

    And it's broken. The last rule catches everything, even the requests that have static/ or admin/ in them. Why? Shouldn't the [L] flag stop the rewriting process in the first two cases? Why is the third case evaluated? Is there a better way of solving this? I'm not wedded to mod_rewrite; anything is fine as long as it does not need access to server-level config. I don't have access to RewriteLog or anything like that. Please help :(
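
    A likely explanation, given how mod_rewrite behaves in per-directory (.htaccess) context: [L] only ends the current pass, and the rewritten URL is then re-injected for another full pass over the rule set. On the second pass, a path like siteD/sites/static/htdocs/... still begins with siteD/, so the catch-all fires again. A common workaround (a sketch against the layout above) is to make the catch-all skip anything already rewritten into sites/:

        # Guard: ignore URLs that have already been rewritten into sites/
        RewriteCond %{REQUEST_URI} !^/siteD/sites/ [NC]
        RewriteCond %{REQUEST_URI} ^/siteD/(.*)$ [NC]
        RewriteRule ^siteD/(.*)$ siteD/sites/public/htdocs/index.php [L,QSA]

    RewriteCond directives combine with AND by default, so both conditions apply to the same RewriteRule.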

    Read the article

  • How do I host multiple independent, secured SharePoint sites (WSS 3.0) without using Active Directory on the same server?

    - by Kyle Noland
    I have a SharePoint site set up on one of my networks to service Active Directory users. To be clear, this is a Windows SharePoint Services 3.0 installation running on Windows Server 2003 Standard; it is not an option to upgrade the server or SharePoint version. Management would like to create several new sites, one for each of a handful of clients. These sites will be used like "dropboxes" or FTP sites so that my company can make large files available to outside contacts, and vice versa. Here are my requirements:

        I do not want to have to create Active Directory accounts for each external contact.
        If possible, I would like to store the external usernames and passwords in a database that I can write a small GUI for, so that management can handle adding their own external contacts.
        Each client site must be sandboxed from the others and from my main company SharePoint site.
        I would like to keep everything running on port 80 and be able to access the sites as either clientname.mycompany.com or www.mycompany.com/clientname

    If anybody has ever done this, I would really appreciate hearing about any lessons you learned and suggestions for how to set this up. Kyle

    Read the article

  • Why can't I route to some sites from my MacBook Pro that I can see from my iPad?

    - by Robert Atkins
    I am on M1 Cable (residential) broadband in Singapore. I have an intermittent problem routing to some sites from my MacBook Pro, often Google-related sites (arduino.googlecode.com and ajax.googleapis.com right now, but sometimes even gmail.com). This prevents StackExchange chat from working, for instance. Funny thing is, my iPad can route to those sites, and it's on the same wireless network! I can ping the sites but not traceroute to them, which I find odd. That the iPad can get through implies the problem is with the MBP. In any case, calling M1 support is... not helpful. I get the same behaviour when I bypass the Airport Express entirely and plug the MBP directly into the cable modem. Can anybody explain a) how this is even possible and b) how to fix it?

        mella:~ ratkins$ ping ajax.googleapis.com
        PING googleapis.l.google.com (209.85.132.95): 56 data bytes
        64 bytes from 209.85.132.95: icmp_seq=0 ttl=50 time=11.488 ms
        64 bytes from 209.85.132.95: icmp_seq=1 ttl=53 time=13.012 ms
        64 bytes from 209.85.132.95: icmp_seq=2 ttl=53 time=13.048 ms
        ^C
        --- googleapis.l.google.com ping statistics ---
        3 packets transmitted, 3 packets received, 0.0% packet loss
        round-trip min/avg/max/stddev = 11.488/12.516/13.048/0.727 ms

        mella:~ ratkins$ traceroute ajax.googleapis.com
        traceroute to googleapis.l.google.com (209.85.132.95), 64 hops max, 52 byte packets
        traceroute: sendto: No route to host
        1 traceroute: wrote googleapis.l.google.com 52 chars, ret=-1
        *traceroute: sendto: No route to host
        traceroute: wrote googleapis.l.google.com 52 chars, ret=-1
        ^C

    The traceroute from the iPad goes (and I'm copying this by hand):

        10.0.1.1
        119.56.34.1
        172.20.8.222
        172.31.253.11
        202.65.245.1
        202.65.245.142
        209.85.243.156
        72.14.233.145
        209.85.132.82

    From the MBP, I can't traceroute to any of the IPs from 172.20.8.222 onwards. [For extra flavour, not being able to access the above appears to stop me logging in to Server Fault via OpenID and formatting the above traceroutes correctly. Anyone with sufficient rep here to do so, I'd be much obliged.]

    Read the article
