Search Results

Search found 806 results on 33 pages for 'webmaster'.

Page 13 of 33

  • Getting a list of Tasks that belong to a Role from Azman

    - by Steven
    I'm using AZROLESLib (from the COM reference "azroles 1.0 Type Library") and I'm trying to build a list of the tasks assigned to each role currently defined in my Authorization Manager store, but when I loop through a role's tasks, all I get back is the role name. I've looked all around but couldn't find anything that would help. Here's what I have currently (it's not super pretty, but I'm just trying to get it working at the moment):

        AzAuthorizationStoreClass AzManStore = new AzAuthorizationStoreClass();
        AzManStore.Initialize(0, ConfigurationManager.ConnectionStrings["AzManStore"].ConnectionString, null);
        IAzApplication azApp = AzManStore.OpenApplication("StoreName", null);

        StringBuilder output = new StringBuilder();
        Array tasks = null;
        foreach (IAzRole currentRole in azApp.Roles)
        {
            output.Append(currentRole.Name + "<br />");
            tasks = (Array)currentRole.Tasks;
            foreach (object ob in tasks)
            {
                output.Append("&nbsp;&nbsp; -" + ob.ToString() + "<br />");
            }
        }
        return output.ToString();

    What comes out is:

        Administrator
           -Administrator
        Account Manager
           -Account Manager
        Corporate Marketing Specialist
           -Corporate Marketing Specialist
        General Employee
           -General Employee
        Marketing Manager
           -Marketing Manager
        Regional Marketing Specialist
           -Regional Marketing Specialist
        Sales Manager
           -Sales Manager
        Webmaster
           -Webmaster

    but what should come out is something like:

        Webmaster
           Websites Maintain
           News Maintain
           Events Maintain
           Reports Read

    Thanks in advance.
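
    One way to dig a level deeper, sketched here under the assumption that each role assignment carries a role definition of the same name (the variable names are illustrative and the snippet is untested): open each entry returned by Tasks as a task definition and list its child tasks.

        // For each role, open the entries it returns as task (role) definitions
        // and list the lower-level tasks those definitions contain.
        foreach (IAzRole currentRole in azApp.Roles)
        {
            output.Append(currentRole.Name + "<br />");
            foreach (object taskName in (Array)currentRole.Tasks)
            {
                // The names here are usually role-definition names; opening them
                // as tasks reaches the real task list.
                IAzTask definition = azApp.OpenTask(taskName.ToString(), null);
                foreach (object childTask in (Array)definition.Tasks)
                {
                    output.Append("&nbsp;&nbsp; -" + childTask + "<br />");
                }
            }
        }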

  • Tools for managing code deployment/versioning for IIS / Windows environments

    - by RizwanK
    I've got a strong background in Linux and OS X, and just left a job where I was architecting systems based on those platforms. Now I've got a Windows Server running IIS that hosts a number of different websites. Most of them are just a bunch of HTML, JS and images, with some ASP for some customer tools. (Each website has a different set of customer tools, or the same tools with minor code changes between them.) I'm also adding a development web server with the same code, but the 'bleeding edge' stuff. I need an effective way of managing changes and updates to the overall codebase (henceforth meaning the images, the HTML and the ASP, for all the sites). When a dev (or webmaster) checks in changes, I want them to show up automatically on the development server, but to be pushed out to the live server manually. I'd be tempted to just make the websites SVN working copies, but I'm concerned about the overhead of the web developer having to log into the server and trigger an SVN update via the command line or TortoiseSVN (and heaven forbid, manage tags). Ideally I'd also manage IIS profile settings between the systems, but the major need is to be able to manage the process and expose it to our ASP developer and our webmaster, both of whom are used to just FTPing the files up to the live site. So, any recommendations on tools (beyond some SVN hacking with BAT files plus teaching the webmaster how to log into the server and do updates) or workflows that would help? I even considered an RPM-type package (or some Windows equivalent, of course) to manage the live server, but that seems like a bit too much overhead. Thanks.
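
    One low-tech workflow that often gets suggested for this, sketched here with purely illustrative paths and on the assumption that the repository lives on the same Windows box and the Subversion command-line client is installed: a post-commit hook keeps the development site's working copy current on every check-in, while the live site is only updated when someone deliberately runs the same command against its working copy.

        REM hooks\post-commit.bat in the repository (paths are illustrative)
        REM %1 = repository path, %2 = revision number
        "C:\Program Files\Subversion\bin\svn.exe" update "D:\inetpub\dev-site" --non-interactive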

  • Allowing creation/modification of virtual aliases using web.config

    - by user25018
    I've been given a problem to fix, and I initially thought of .htaccess files, except for one thing: I quickly realized it's an IIS server. Is it possible to allow a webmaster to modify virtual directories/aliases using web.config files, in the same way you can with .htaccess files? If so, any ideas on where I can find details on how this is done that I can pass along to the end client? We want to be able to do this without giving the webmaster access to the IIS console. An example of the desired change: have http://FQDN/careers point to http://FQDN/Careers/Careers.aspx?locale=en-ca&uid=Careers, with such mappings modified/added/removed by the end user using web.config.
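
    For reference, one common way to do this on IIS 7 or later is the URL Rewrite module (installed separately), which reads its rules from web.config and can therefore be edited by the webmaster without touching the IIS console; the rule below is only an illustrative sketch:

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="careers-shortcut" stopProcessing="true">
                  <match url="^careers/?$" />
                  <action type="Rewrite" url="Careers/Careers.aspx?locale=en-ca&amp;uid=Careers" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>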

  • Centos Virtual host loading default page

    - by ntechi
    I have asked a related (but not identical) question before. I have a CentOS VPS which hosts two WordPress websites, mbas.co.in and onlinemba123.com, using name-based virtual hosting on just one IP address. I set up mbas.co.in first and it works fine, but when I added onlinemba123.com it loads the default CentOS page instead of my website. I am only testing onlinemba123.com at the moment: I haven't configured DNS for it yet, so I'm testing it by editing my PC's hosts file. The folder names on the server match the ones in the config below. My question is: how can I get my website to load instead of the default page? Is my virtual host config fine?

    My virtual host config:

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot /var/www/html/www.mbas.co.in
            ServerName mbas.co.in
            ErrorLog logs/mbas.co.in-error_log
            CustomLog logs/mbas.co.in-access_log common
        </VirtualHost>

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot /var/www/html/www.onlinemba123.com
            ServerName www.onlinemba123.com
            ErrorLog logs/onlinemba123-error_log
            CustomLog logs/onlinemba123-access_log common
        </VirtualHost>

    My computer's hosts file is:

        xx.xxx.xxx.xxx www.onlinemba123.com
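
    Two things that are usually worth checking in this situation, offered as a hedged suggestion rather than a confirmed fix: dump the vhost table Apache actually parsed, and make the second vhost answer for the bare domain as well as the www form:

        # show the vhosts Apache actually loaded and which one is the default
        httpd -S

        # inside the onlinemba123.com vhost, match the bare domain as well
        ServerName www.onlinemba123.com
        ServerAlias onlinemba123.com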

  • Homepage 301 Redirect to SSL Homepage

    - by user33692
    I'm hoping somebody might be able to provide a bit of advice on an issue I am having. I have one site where we implemented a 301 redirect on the homepage from http to https. We have links on the homepage to other parts of the site that are not under SSL (in fact there is only one other page under SSL). When I go to our webmaster account I notice that we are not being provided with any webmaster information (search queries, backlinks) related to our homepage under SSL. I performed a Fetch as Google on the homepage and the information it returned is:

        HTTP/1.1 301 Moved Permanently
        Date: Fri, 08 Nov 2013 17:26:24 GMT
        Server: Apache/2.2.16 (Debian)
        Location: https://mysite.com/
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 242
        Keep-Alive: timeout=15, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1

        <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
        <html><head>
        <title>301 Moved Permanently</title>
        </head><body>
        <h1>Moved Permanently</h1>
        <p>The document has moved <a href="https://mysite.com/">here</a>.</p>
        <hr>
        <address>Apache/2.2.16 (Debian) Server at mysite.com</address>
        </body></html>

    I am worried that Fetch as Google is not getting the correct title tag and meta information from our homepage, and that this is hurting our search results. I am also worried that we need to do something specific with the sitemap to ensure that Google correctly indexes all our pages and can move between the https and http pages without issues. Does anybody have any advice on how we can set this up correctly, or be sure that Google is fetching the correct information?
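
    One piece of housekeeping that is often recommended alongside such a redirect, offered as a hedged sketch rather than a confirmed fix: register both the http and https versions as separate properties in Webmaster Tools, and declare the https homepage as canonical so the signals for the two URLs are consolidated, e.g.:

        <!-- in the <head> of the homepage as served over https -->
        <link rel="canonical" href="https://mysite.com/" />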

  • How to fix some damage from a site hack?

    - by Towhid
    My site was hacked. I found the vulnerability, fixed it, and removed the shell scripts. But the hacker had uploaded thousands of web pages to my web server. After I removed those pages I got over 4,000 "Not Found" pages on my site (all linked from an external free domain and host, which is gone now). Hundreds of keywords had also been added to my site. Three weeks later I can still see keywords from the removed pages in my Google Webmaster Tools. I used to have the first result on Google for certain keywords, but now I am on the third page for the same keywords. 50% of my traffic was from Google, and it is now reduced to 6%. How can I fix both the "Not Found" pages problem and the new useless keywords? And will that be enough to get me back to the first result on Google? P.S.: 1) Both the vulnerability and the uploaded files have certainly been removed. 2) My site is not infected; I checked with Google Webmaster Tools and a few other security web scan tools. 3) All the files had been uploaded to one directory, so I got something like site.com/hacked/page1.html and site.com/hacked/webpage2.html.
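
    Since everything lived under a single directory, one commonly suggested cleanup (assuming an Apache host where .htaccess or the vhost config can be edited) is to answer 410 Gone for that whole path, which signals the pages are permanently removed rather than temporarily missing:

        # answer 410 Gone for everything under the directory the hacker used
        Redirect gone /hacked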

  • What's the point of @localhost entries for a mail server?

    - by radman
    After recently setting up a mail server, I am now adding the users that I need. As part of the tutorial I followed, I created root@localhost as an account and also a bunch of aliases (postmaster@localhost, webmaster@localhost, etc.). What is the point of having all these localhost addresses? It seems that no one can ever mail them directly. I am also curious as to which targets I should include on my domain (like postmaster, root, webmaster, etc.) and what the ramifications of doing so might be.
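
    For context, a sketch of what such aliases typically look like on a Postfix or Sendmail-style system (the destination address is purely illustrative); mail from cron jobs, the MTA itself, and monitoring scripts is what usually ends up at these local addresses:

        # /etc/aliases -- run `newaliases` after editing
        postmaster: root
        webmaster:  root
        root:       admin@example.com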

  • cannot get apache2 redirect working for a site

    - by benson
    What I want to do is redirect all visitors going to example.com to www.example.com. It seems a very common task, but for some reason it is not working for this specific site: it always serves the default site. Strangely, if I replace the domain with another one (yyyyy.com and www.yyyyy.com), it works all right. I checked my DNS, and the name resolves to the right IP. Here is my virtual host configuration:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www/html/example.com
            Servername www.example.com
            <Directory />
                Options FollowSymLinks
                AllowOverride All
            </Directory>
            <Directory /var/www/html/example.com>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
        </VirtualHost >

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            Servername example.com
            Redirect 301 / http://www.example.com
        </VirtualHost>
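
    On Debian/Ubuntu-style installs the usual suspects are a missing NameVirtualHost *:80 line or a site file that was never enabled, so the request falls through to the default vhost; a quick, hedged set of checks:

        apache2ctl -S            # shows the parsed name-based vhosts and the default
        a2ensite example.com     # make sure this site file is actually enabled
        service apache2 reload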

  • How to make local drive available in apache localhost

    - by Ronald Allan
    How can I make my Drive D: and Drive E: available in localhost? I'm running Apache on my BackTrack machine. My default is /var/www/. Every directory I create inside /var/www/ is available and works fine. Let's say I create /var/www/PENTEST/; the contents of that PENTEST directory can be accessed through localhost/PENTEST/. How can I make this work: localhost/media/DATA/? The /media/DATA/ is my Drive D:. I edited this:

        ServerAdmin webmaster@localhost
        DocumentRoot /media/DATA/
        <Directory />
            Options FollowSymLinks
            AllowOverride None
        </Directory>
        <Directory /media/DATA/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            allow from all
        </Directory>
        ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
        <Directory "/usr/lib/cgi-bin">
            AllowOverride None
            Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
            Order allow,deny
            Allow from all
        </Directory>
        ErrorLog /var/log/apache2/error.log
        # Possible values include: debug, info, notice, warn, error, crit,
        # alert, emerg.
        LogLevel warn
        CustomLog /var/log/apache2/access.log combined
        Alias /doc/ "/usr/share/doc/"
        <Directory "/usr/share/doc/">
            Options Indexes MultiViews FollowSymLinks
            AllowOverride None
            Order deny,allow
            Deny from all
            Allow from 127.0.0.0/255.0.0.0 ::1/128
        </Directory>

    Still not working; I'm getting a 404.

    Update: I figured it out, thanks to the post by "RiggsFolly" which can be found here: http://forum.wampserver.com/read.php?2,89163. I just had to change this:

        ServerAdmin webmaster@localhost
        DocumentRoot /media/DATA/
        <Directory />
            Options FollowSymLinks
            AllowOverride None
        </Directory>
        <Directory /media/DATA/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            allow from all
        </Directory>

    into this:

        ServerAdmin webmaster@localhost
        DocumentRoot D:/media/DATA/
        <Directory />
            Options FollowSymLinks
            AllowOverride None
        </Directory>
        <Directory D:/media/DATA/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            allow from all
        </Directory>
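
    On a Linux host such as BackTrack, one less invasive option (assuming the apache user can read and traverse the mount point) is to leave DocumentRoot alone and expose the mounted drive through an Alias, roughly like this:

        Alias /media/DATA/ "/media/DATA/"
        <Directory "/media/DATA/">
            Options Indexes FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>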

  • I run Webmin and I want it to be accessed with two URLs, both using proxypass in apache

    - by user36644
    This is what I am trying to do:

        NameVirtualHost *

        <VirtualHost *>
            ServerName testsite.org
            ServerAdmin [email protected]
            DocumentRoot /var/www/
        </VirtualHost>

        <VirtualHost *>
            ServerName panel.testsite.org
            ProxyPass / http://panel.testsite.org:10000/
            ProxyPassReverse / http://panel.testsite.org:10000/
        </VirtualHost>

        <VirtualHost 12.34.56.78>
            ServerName newsite.com
            ServerAdmin [email protected]
            DocumentRoot /var/newsite/
        </VirtualHost>

        <VirtualHost 12.34.56.78>
            ServerName panel.newsite.com
            ProxyPass / http://panel.newsite.com:10000/
            ProxyPassReverse / http://panel.newsite.com:10000/
        </VirtualHost>

    The problem is that it won't accept the second vhost for the IP 12.34.56.78, because it says one already exists. panel.newsite.com and newsite.com have the same IP, so I am not sure how I can make it so that only the URL panel.newsite.com gets proxied to port 10000, while no other URL on newsite.com does.
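
    The usual fix, sketched here in Apache 2.2-era syntax and assuming Webmin listens on the same box: enable name-based hosting for that IP as well, so Apache can choose between the two vhosts by Host header instead of rejecting the second one as a duplicate:

        NameVirtualHost 12.34.56.78:80

        <VirtualHost 12.34.56.78:80>
            ServerName newsite.com
            DocumentRoot /var/newsite/
        </VirtualHost>

        <VirtualHost 12.34.56.78:80>
            ServerName panel.newsite.com
            ProxyPass / http://127.0.0.1:10000/
            ProxyPassReverse / http://127.0.0.1:10000/
        </VirtualHost>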

  • Wordpress blog penalized by Google search - what's wrong?

    - by pawelbrodzinski
    I have a blog (http://blog.brodzinski.com), which is a wordpress.org blog running the pretty popular Thesis theme with almost no other customizations. Some time ago it was penalized by Google search: it simply stopped appearing in search results, even for search terms for which it used to be the top result, like my name, Pawel Brodzinski, which isn't anything close to a popular search term. To be exact, the site was penalized on Nov 18. It started popping up in search results on Dec 23, but only for a few days; since Dec 27 it has been out again. I know the Google guidelines and I'm not aware of breaking any of them. I submitted a reconsideration request after I noticed the penalty. It was processed and there was no change whatsoever (no surprise, as it seems the site was penalized again). I checked diagnostics in Webmaster Tools: no malware was detected and no strange search terms popped up. I read related threads on the Google Webmasters forum but found none of the solutions working for me. I posted a thread on the Google Webmasters forum (http://www.google.com/support/forum/p/Webmasters/thread?tid=546339f49d4a03bc&hl=en) and the only answer I got was to check for duplicate content. Well, there is some duplicate content published on the web, but that is true for the vast majority of blogs and it doesn't seem to be a reason for a penalty. Also, before Dec 27 I was able to get the duplicate content removed from a couple of sites which were republishing my feed, but this didn't change the situation: the site was penalized again. The problem is I have no idea what can be wrong with the website or how to find out. To make the problem worse, I'm no webmaster; I just run a WordPress blog, which is supposed to be easy.

  • Apache virtual host documentroot in other folders

    - by giuseppe
    I am trying to set up a couple of VirtualHosts in my Apache, but I would like to put the DocumentRoot of these virtual hosts in folders outside the basic www folder. The result is that I always get "Permission Denied". My httpd.conf follows:

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot /home/giuseppe/www
            ServerName www.example.com/www
            ErrorLog logs/host.www.projects-error_log
            CustomLog logs/dummy-host.example.com-access_log common
            <Directory "/home/giuseppe/www">
                Options Indexes FollowSymLinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot /home/developper
            ServerName www.example.com
            ErrorLog logs/host.developper-error_log
            CustomLog logs/dummy-host.example.com-access_log common
        </VirtualHost>
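
    With a DocumentRoot under /home, "Permission denied" usually comes from the apache user being unable to traverse the home directory, or from SELinux on Red Hat-style systems; a hedged checklist to adapt to the actual distribution:

        chmod o+x /home/giuseppe              # let the web server traverse the home dir
        chmod -R o+r /home/giuseppe/www       # and read the content
        # on systems with SELinux enforcing, the files also need the right context:
        chcon -R -t httpd_sys_content_t /home/giuseppe/www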

  • How do I find information on who links to my sites?

    - by bobdobbs
    I'm trying to figure out if there's a free way to get information on backlinks to my site. I've had Webmaster Tools and Google Analytics set up for years, but I can't find access to data about site backlinks in either toolset. Webmaster Tools, under 'Traffic' > 'Links to your site', gives me the same message for all of my sites: "No data available". I haven't been able to find anything in GA that gives any information on backlinks. I've heard of using "link:" as an operator in Google search, but for each of my sites this returns either zero or very few results, in cases where I know I have many backlinks; most of the links simply aren't shown. My thinking is that Google maintains a graph of who links to my site, so I figured they might let me see it, but I can't figure out how. I've found this tool on a spammy website: http://www.backlinkwatch.com. It offers more data than Google on my backlinks, and offers more results in exchange for a paid subscription. The data it offers for free looks good, but the results are limited and the site has popups and obnoxious ads. So, in short: how do I get data on who links to me? Is there a free way?

  • How to clear unshown OSD notifications

    - by Avatar Parto
    I am the webmaster of 2 active websites. Every time I start Thunderbird, it downloads all the new messages from the admin mail addresses of these sites ([email protected], [email protected], [email protected], etc.). There are about 20 of them, including my personal email accounts. Now, I am not always connected to the internet, and there are cases where I need to access already downloaded mail. Thunderbird then tries to connect to the internet, which it can't because I'm offline, and then displays about twenty OSD notifications, one after the other, telling me that the connection for each email address has failed. That's frustrating and it is really wearing me down. I'm looking for a way either: 1. to tell Thunderbird NOT to connect to the internet when I'm offline, or 2. to clear all these notifications after, say, the first one shows. And if it helps, I'm using Ubuntu 13.04 and Thunderbird 24.0, and I always keep my computer fully updated. Any help is welcome. Thanks.

  • Blogger homepage won't update!

    - by Sims Siniron
    I am new to blogging and Webmaster Tools. Usually, when I add a new post to my blog, my homepage is automatically updated as well. But since last 14th January, my homepage has not been updated in the Google SERPs. As a result I am losing my visibility in the SERPs. Previously, when I posted a new article, 70-80% of my posts would reach the first page of results, but after the problem occurred, none of them has reached even the top 15 pages of the Google SERPs :( On 1/12/12, Google Webmaster Tools sent me a "Notice of DMCA removal from Google Search" message indicating one of my URLs contained some infringing content, which I deleted after receiving the notice. Not only that, I also checked all of my posts for any additional infringing content. After removing it, I filled out Google's Content Removed Notification form to notify them, and Google sent me feedback that they had received it, suggesting "In the future, if you have removed the allegedly infringing content from your site (and won't put it back), please use the correct form", which I also filled out. Now my question is: is everything I did so far all right? And although my new posts are indexed in the Google SERPs with "..", why won't Googlebot update my homepage, which previously updated automatically whenever a new article was published?

  • Apache2 Virtual Host broken; displayed default index.html on subdomain, but correct content on www.subdomain

    - by Robert K
    I've got a Linode configured as an Ubuntu 10.04.2 web server with Apache 2.2.14. I have a total of 4 sites, all defined under /etc/apache2/sites-available as virtual hosts. All the sites are almost identical clones as far as configuration goes, and all of them work except the last:

        working: (www.)exampleadnetwork.com, (www.)example.com, reseller.example.com
        trouble: client1.example.com

    I keep getting this page when I visit the client1.example.com site:

        It works! This is the default web page for this server.
        The web server software is running but no content has been added, yet.

    In my ports.conf file I have NameVirtualHost correctly set to my IP address on port 80. If I access the "www.sub.example.com" alias the site works! If I access it without the www I see the "It works" page quoted above. Even apache2ctl -S shows that my vhost file parses correctly and is added to the mix. My vhost configuration file is as follows:

        <VirtualHost 127.0.0.1:80>
            ServerAdmin [email protected]
            ServerName client1.example.com
            ServerAlias client1.example.com www.client1.example.com
            DocumentRoot /srv/www/client1.example.com/public_html/
            ErrorLog /srv/www/client1.example.com/logs/error.log
            CustomLog /srv/www/client1.example.com/logs/access.log combined
            <directory /srv/www/client1.example.com/public_html/>
                Options -Indexes FollowSymLinks
                AllowOverride None
                Order allow,deny
                Allow from all
            </directory>
        </VirtualHost>

    The other sites are variations of:

        <VirtualHost 127.0.0.1:80>
            ServerAdmin [email protected]
            ServerName example.com
            ServerAlias example.com www.example.com
            DocumentRoot /srv/www/example.com/public_html/
            ErrorLog /srv/www/example.com/logs/error.log
            CustomLog /srv/www/example.com/logs/access.log combined
        </VirtualHost>

    The only site that differs is the other subdomain:

        <VirtualHost 127.0.0.1:80>
            ServerAdmin [email protected]
            ServerName reseller.example.com
            ServerAlias reseller.example.com
            DocumentRoot /srv/www/reseller.example.com/public_html/
            ErrorLog /srv/www/reseller.example.com/logs/error.log
            CustomLog /srv/www/reseller.example.com/logs/access.log combined
        </VirtualHost>

    Filenames are the FQDN without the www. prefix. I've followed this advice, but still cannot access the subdomain properly.
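
    One thing that stands out, offered as a hedged observation rather than a confirmed diagnosis: these vhosts bind to 127.0.0.1:80 while NameVirtualHost (per the description) names the public IP, and such a mismatch sends requests to the default vhost; binding everything to *:80 is the usual remedy, roughly:

        # in ports.conf
        NameVirtualHost *:80
        Listen 80

        # in each site file
        <VirtualHost *:80>
            ServerName client1.example.com
            ServerAlias www.client1.example.com
            DocumentRoot /srv/www/client1.example.com/public_html/
        </VirtualHost>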

  • Phishing site uses subdomain that I never registered

    - by gotgenes
    I recently received the following message from Google Webmaster Tools:

        Dear site owner or webmaster of http://gotgenes.com/,
        [...]
        Below are one or more example URLs on your site which may be part of a
        phishing attack:
        http://repair.gotgenes.com/~elmsa/.your-account.php
        [...]

    What I don't understand is that I never had a subdomain repair.gotgenes.com, yet visiting it in a web browser brings up an actual page. My DNS is FreeDNS, which does not list a repair subdomain. My domain name is registered with GoDaddy, and the nameservers are correctly set to NS1.AFRAID.ORG, NS2.AFRAID.ORG, NS3.AFRAID.ORG, and NS4.AFRAID.ORG. I have the following questions: Where is repair.gotgenes.com actually registered? How was it registered? What action can I take to have it removed from DNS? How can I prevent this from happening in the future? This is pretty disconcerting; I feel like my domain has been hijacked. Any help would be much appreciated.
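
    A reasonable first step, assuming a machine with the standard DNS tools: check who actually answers for the rogue name and compare it with the records you control, since FreeDNS allows other users to attach subdomains to domains that are marked as shared/public:

        dig +short repair.gotgenes.com          # what does the rogue name resolve to?
        dig +short gotgenes.com                 # compare with your own A record
        dig NS gotgenes.com +short              # confirm which nameservers are authoritative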

  • Google penalty recovery

    - by sajeev
    I have a site, which is a spiritual site and has nothing to do with anything commercial or business-related. It used to be on the first page for the keyword RADHANATH SWAMI. Then suddenly, in Sep '13, the site dropped out of Google search. When I checked my Webmaster Tools, I saw that there were 40,000 backlinks from a site named http://freeguestbooks.net, but the manual actions section in Webmaster Tools said "No manual webspam actions found", and to date it shows the same. I then contacted the webmaster of freeguestbooks.net and requested that he remove the 40K-odd backlinks, which he very kindly did. Webmaster Tools now shows 21,777 backlinks from this site, but over the last two months the rate at which these backlinks decrease has been very slow, almost zero. I contacted the webmaster of freeguestbooks.net again and he confirmed that no links to my site remain on his site. I am also told that my anchor text is over-optimised. The total backlinks to my site as per Webmaster Tools is only 24,937, of which 21,777 are the links from freeguestbooks.net shown in Webmaster Tools. Could an expert suggest a way to get back into Google? Sajeev, India
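
    When stale links linger in the Webmaster Tools report, the step usually suggested next (offered here as a hedged sketch, not a guaranteed remedy) is the Disavow Links tool, whose input is a plain text file along these lines (the file name is arbitrary):

        # disavow.txt -- uploaded via the Disavow Links tool
        # a domain: line discounts every link from that domain
        domain:freeguestbooks.net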

  • Conditional cPanel Forwarder

    - by Wireblue
    We have many clients on a cPanel server, some of whom also have SSL certificates set up. Each year, when the renewal notices are sent to webmaster@ their domain, we would like a copy of those emails so we can install the SSL certificate for them, issue invoices, and so on. So I'm wondering if there is a way we can selectively or conditionally forward certain emails if they match certain rules (in cPanel). I'm thinking: if an email is sent to "webmaster@domain" and the subject contains "ssl", then forward it to "[email protected]". Any ideas? Thanks in advance, Wireblue
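
    cPanel's per-account Email Filters can do exactly this kind of subject-match-and-forward, and under the hood they compile to an Exim filter roughly like the sketch below (the destination address is illustrative):

        # Exim filter
        if $header_subject: contains "ssl"
        then
            unseen deliver "certs@ourcompany.com"
        endif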

  • Why does Google not ban these sites using this SEO pattern? [on hold]

    - by saddam.bg
    I have seen some sites using a different kind of SEO to promote copyrighted material such as movies. They have also submitted their sites to Google Webmaster Tools, but they still have not been banned, and their Alexa ranks are 7000 or lower. On the other hand, I have run 5 movie affiliate sites and all of them were banned by Google within a short period of time. I copied the URL of the homepage of solarmovie.me and pasted it into Google search, and instead of the homepage URL I saw that their category or tag page shows up as the homepage (www.solarmovie.me/watch-category/hollyw...). Now, is solarmovie.me publishing its posts as a single page, or something else? I tried to find out what kind of SEO or coding that was, but I couldn't, since I have very little knowledge of coding. I have also seen the same thing with ALLUC.TO in Google search (www.alluc.to/popular-links.html). Could anyone please help with SEO of this kind, so that I don't get banned by Google so often or have my index removed? All you SEO webmasters, I need your help! Please give me some good tips for this type of SEO. Thank you very much.

  • Virtual host in Apache Zend

    - by llocani
    I'd like to ask if you can tell me why I can't get virtual hosts in Apache to work. My vhost conf is:

        NameVirtualHost *:80

        <VirtualHost _default_:80>
            ServerAdmin [email protected]
            DocumentRoot "E:/Archivos de programa/Zend/Apache2/htdocs"
            ServerName localhost
            <Directory "E:/Archivos de programa/Zend/Apache2/htdocs">
                Options FollowSymLinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
            #AllowOveride all
        </VirtualHost>

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot "E:/Documents and Settings/dvieira/Mis documentos/NetBeansProjects/HealingHands"
            ServerName healinghands.loc
            <Directory "E:/Documents and Settings/dvieira/Mis documentos/NetBeansProjects/HealingHands">
                Options FollowSymLinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog "E:/Documents and Settings/dvieira/Mis documentos/NetBeansProjects/HealingHands/logs/error.log"
            CustomLog "E:/Documents and Settings/dvieira/Mis documentos/NetBeansProjects/HealingHands/logs/access.log" common
            #AllowOveride all
        </VirtualHost>

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot "E:/Documents and Settings/dvieira/Mis documentos/NetBeansProjects"
            ServerName dev.loc
            <Directory "E:/Documents and Settings/dvieira/Mis documentos/NetBeansProjects">
                Options FollowSymLinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog "E:/Documents and Settings/dvieira/Mis documentos/NetBeansProjects/logs/error.log"
            CustomLog "E:/Documents and Settings/dvieira/Mis documentos/NetBeansProjects/logs/access.log" common
            #AllowOveride all
        </VirtualHost>

    My httpd.conf is:

        ServerRoot "E:\Archivos de programa\Zend\Apache2"
        Listen 80
        LoadModule actions_module modules/mod_actions.so
        LoadModule alias_module modules/mod_alias.so
        LoadModule asis_module modules/mod_asis.so
        LoadModule auth_basic_module modules/mod_auth_basic.so
        LoadModule auth_digest_module modules/mod_auth_digest.so
        LoadModule authn_default_module modules/mod_authn_default.so
        LoadModule authn_file_module modules/mod_authn_file.so
        LoadModule authz_default_module modules/mod_authz_default.so
        LoadModule authz_groupfile_module modules/mod_authz_groupfile.so
        LoadModule authz_host_module modules/mod_authz_host.so
        LoadModule authz_user_module modules/mod_authz_user.so
        LoadModule autoindex_module modules/mod_autoindex.so
        LoadModule cgi_module modules/mod_cgi.so
        LoadModule dir_module modules/mod_dir.so
        LoadModule env_module modules/mod_env.so
        LoadModule filter_module modules/mod_filter.so
        LoadModule headers_module modules/mod_headers.so
        LoadModule imagemap_module modules/mod_imagemap.so
        LoadModule include_module modules/mod_include.so
        LoadModule info_module modules/mod_info.so
        LoadModule isapi_module modules/mod_isapi.so
        LoadModule log_config_module modules/mod_log_config.so
        LoadModule mime_module modules/mod_mime.so
        LoadModule mime_magic_module modules/mod_mime_magic.so
        LoadModule negotiation_module modules/mod_negotiation.so
        LoadModule rewrite_module modules/mod_rewrite.so
        LoadModule setenvif_module modules/mod_setenvif.so
        LoadModule ssl_module modules/mod_ssl.so
        LoadModule status_module modules/mod_status.so
        LoadModule userdir_module modules/mod_userdir.so
        <IfModule !mpm_netware_module>
            <IfModule !mpm_winnt_module>
                User daemon
                Group daemon
            </IfModule>
        </IfModule>
        ServerAdmin [email protected]
        DocumentRoot "E:\Archivos de programa\Zend\Apache2/htdocs"
        <Directory />
            Options FollowSymLinks
            AllowOverride all
            Order allow,deny
            Allow from all
        </Directory>
        <IfModule dir_module>
            DirectoryIndex index.php index.html home.php
        </IfModule>
        <FilesMatch "^\.ht">
            Order allow,deny
            Deny from all
            Satisfy All
        </FilesMatch>
        ErrorLog "logs/error.log"
        LogLevel warn
        <IfModule log_config_module>
            LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
            LogFormat "%h %l %u %t \"%r\" %>s %b" common
            <IfModule logio_module>
                LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %I %O" combinedio
            </IfModule>
            CustomLog "logs/access.log" common
        </IfModule>
        <IfModule alias_module>
            Alias /NetBeansProjects "E:\Documents and Settings\dvieira\Mis documentos\NetBeansProjects"
            ScriptAlias /cgi-bin/ "E:\Archivos de programa\Zend\Apache2/cgi-bin/"
        </IfModule>
        <IfModule cgid_module>
        </IfModule>
        <Directory "E:\Archivos de programa\Zend\Apache2/cgi-bin">
            AllowOverride None
            Options None
            Order allow,deny
            Allow from all
        </Directory>
        DefaultType text/plain
        <IfModule mime_module>
            TypesConfig conf/mime.types
            AddType application/x-compress .Z
            AddType application/x-gzip .gz .tgz
        </IfModule>
        Include conf/extra/httpd-vhosts.conf
        <IfModule ssl_module>
            SSLRandomSeed startup builtin
            SSLRandomSeed connect builtin
        </IfModule>
        Include "conf/zend.conf"
        NameVirtualHost *:80
        <VirtualHost *:80>
            Include "E:\Archivos de programa\Zend\ZendServer/etc/sites.d/zend-default-vhost-80.conf"
        </VirtualHost>
        Include "E:\Archivos de programa\Zend\ZendServer/etc/sites.d/globals-*.conf"
        Include "E:\Archivos de programa\Zend\ZendServer/etc/sites.d/vhost_*.conf"

    And my hosts file in Windows:

        127.0.0.1 localhost
        127.0.0.1 healinghands.loc
        127.0.0.1 dev.loc

    I can't get any browser to recognize dev.loc or healinghands.loc, but a ping does. localhost is working fine. I've spent 3 days trying to solve this on my own, but I finally quit and have to ask. The error should be this:

        Error Code 11002: host not found.
        Background: this error indicates that the gateway could not find an
        authoritative DNS server for the website you are trying to access.
        Date: 5/20/2013 5:51:03 PM
        Server:
        Source: DNS problem.

    I'd like to add this ping:

        Pinging healinghands.loc [127.0.0.1] with 32 bytes of data:
        Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
        Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
        Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
        Reply from 127.0.0.1: bytes=32 time<1ms TTL=128

        Ping statistics for 127.0.0.1:
            Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
        Approximate round trip times in milli-seconds:
            Minimum = 0ms, Maximum = 0ms, Average = 0ms

    Today I tried something: I added these domains to the exceptions in my IE proxy configuration. This worked for healinghands.loc but not for dev.loc, and I really do not understand why, as both configs are exactly the same except for the DocumentRoot. I will continue searching.

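    One detail worth checking, offered as a hedged suggestion: NameVirtualHost *:80 is declared twice (once in the vhost file and once in httpd.conf) and the Zend default vhost is also bound to *:80, so it helps to see which vhost Apache actually treats as the default; on this layout something like the following dumps the parsed vhost table (the path is taken from the config above):

        cd "E:\Archivos de programa\Zend\Apache2\bin"
        httpd.exe -S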

  • Google Sitemap and Robots.txt Issue

    - by Sarfaraz Soomro
    We have a sitemap at our site, http://www.gamezebo.com/sitemap.xml. Some of the URLs in the sitemap are being reported in Webmaster Central as being blocked by our robots.txt (see gamezebo.com/robots.txt), although these URLs are not disallowed in robots.txt. There are other such URLs as well; for example, gamezebo.com/gamelinks is present in our sitemap, but it is being reported as "URL restricted by robots.txt". I also have a parse result in Webmaster Central that says, "Line 21: Crawl-delay: 10 Rule ignored by Googlebot". What does that mean? I appreciate your help, thanks.
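
    As general background rather than a diagnosis of this particular site: Googlebot does not honour the Crawl-delay directive (its crawl rate is set in Webmaster Tools instead), so a robots.txt along these lines will always produce that "rule ignored" notice (the Disallow path is illustrative):

        User-agent: *
        Crawl-delay: 10        # ignored by Googlebot; other crawlers may honour it
        Disallow: /some-private-path/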

  • Java webapp: where/how to automatically set each picture's width/height

    - by NoozNooz42
    For several reasons, a lot of "webmaster guides" (like Google's and Yahoo!'s webmaster guides/guidelines) repeat several times that it is better to always set the width and height attributes of the img tag. One of the most obvious reasons is that the elements on the page won't appear to "jump around" to new locations as each picture loads (always setting the correct width/height gets rid of this behavior for good). And there are other reasons to follow these guidelines/best practices. So, assuming:

    - these are indeed good practices,
    - there are a lot of pictures and they change often,
    - pictures don't change between two .war redeployments (that is, there are no user-contributed pictures),
    - we don't want to manually edit all these width/height attributes,

    how do we automatically/programmatically serve HTML pages where every img tag has its width and height attributes correctly set, as the best practices recommend?

  • Setting up apache to view https pages

    - by zac
    I am trying to set up a site using VMware Workstation, Ubuntu 11.10, and apache2. The site works fine, but the https pages are not showing up. For example, if I try to go to https://www.mysite.com/checkout I just see the message:

        Not Found
        The requested URL /checkout/ was not found on this server.

    I don't really know what I am doing and have tried a lot of things to get the SSL certificates in there right. A few things I have in there: in my httpd.conf I just have:

        ServerName localhost

    In my ports.conf I have:

        NameVirtualHost *:80
        Listen 80

        <IfModule mod_ssl.c>
            # If you add NameVirtualHost *:443 here, you will also have to change
            # the VirtualHost statement in /etc/apache2/sites-available/default-ssl
            # to <VirtualHost *:443>
            # Server Name Indication for SSL named virtual hosts is currently not
            # supported by MSIE on Windows XP.
            Listen 443 http
        </IfModule>

        <IfModule mod_gnutls.c>
            Listen 443 http
        </IfModule>

    In /etc/apache2/sites-available/default-ssl:

        <IfModule mod_ssl.c>
        <VirtualHost _default_:443>
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www
            <Directory />
                Options FollowSymLinks
                AllowOverride None
            </Directory>
            <Directory /var/www/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride None
                Order allow,deny
                allow from all
            </Directory>
        .... truncated

    In sites-available/default I have:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www
            #DocumentRoot /home/magento/site/
            <Directory />
                Options FollowSymLinks
                AllowOverride None
            </Directory>
            <Directory /var/www/>
            #<Directory /home/magento/site/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride None
                Order allow,deny
                allow from all
            </Directory>
            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog ${APACHE_LOG_DIR}/error.log
            # Possible values include: debug, info, notice, warn, error, crit,
            # alert, emerg.
            LogLevel warn
            CustomLog ${APACHE_LOG_DIR}/access.log combined
            Alias /doc/ "/usr/share/doc/"
            <Directory "/usr/share/doc/">
                Options Indexes MultiViews FollowSymLinks
                AllowOverride None
                Order deny,allow
                Deny from all
                Allow from 127.0.0.0/255.0.0.0 ::1/128
            </Directory>
        </VirtualHost>

        <virtualhost *:443>
            SSLEngine on
            SSLCertificateFile /etc/apache2/ssl/server.crt
            SSLCertificateKeyFile /etc/apache2/ssl/server.key
            ServerAdmin webmaster@localhost
            <Directory />
                Options FollowSymLinks
                AllowOverride None
            </Directory>
            <Directory /var/www/>
            #<Directory /home/magento/site/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride None
                Order allow,deny
                allow from all
            </Directory>
        </virtualhost>

    I also have a file in sites-available set up for my site URL, www.mysite.com, so in /etc/apache2/sites-available/mysite.com:

        <VirtualHost *:80>
            ServerName mysite.com
            DocumentRoot /home/magento/mysite.com
            <Directory />
                Options FollowSymLinks
                AllowOverride All
            </Directory>
            <Directory /home/magento/mysite.com/ >
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ErrorLog /home/magento/logs/apache.log
            # Possible values include: debug, info, notice, warn, error, crit,
            # alert, emerg.
            LogLevel warn
        </VirtualHost>

        <VirtualHost *:443>
            ServerName mysite.com
            DocumentRoot /home/magento/mysite.com
            <Directory />
                Options FollowSymLinks
                AllowOverride All
            </Directory>
            <Directory /home/magento/mysite.com/ >
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ErrorLog /home/magento/logs/apache.log
            # Possible values include: debug, info, notice, warn, error, crit,
            # alert, emerg.
            LogLevel warn
        </VirtualHost>

    Thanks for any help getting this set up! As is probably obvious from this post, I am pretty lost at this point.
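
    One hedged observation: the name-based :443 vhost for the site has no SSL directives of its own and NameVirtualHost *:443 is never declared, so https requests are likely being served by default-ssl, whose DocumentRoot is /var/www (hence the 404 for /checkout); a sketch of the missing pieces, reusing the certificate paths already shown above:

        # in ports.conf (as the default-ssl comment itself suggests)
        NameVirtualHost *:443

        # in sites-available/mysite.com
        <VirtualHost *:443>
            ServerName mysite.com
            DocumentRoot /home/magento/mysite.com
            SSLEngine on
            SSLCertificateFile    /etc/apache2/ssl/server.crt
            SSLCertificateKeyFile /etc/apache2/ssl/server.key
        </VirtualHost>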

  • httpd.conf configuration - for internal/external access

    - by tom smith
    Hey. After a lot of trial and error and research, I've decided to post here in the hope that I can get clarification on what I've screwed up. I've got a situation where I have multiple servers behind a router/firewall. I want to be able to access the sites I have from an internal and an external URL/address, and get the same site. I have to use port forwarding on the router, so I need to be able to use a reverse proxy to send the user to the appropriate server running the Apache/web app.

    My setup; the external URLs:

        joomla.gotdns.com
        forge.gotdns.com

    Both of these point to my router's external IP address, 67.168.2.2 (not really). The router forwards port 80 to my server lserver6 (192.168.1.56).

        lserver6 - 192.168.1.56
        lserver9 - 192.168.1.59

        lserver6 - joomla app
        lserver9 - forge app

    I want the httpd process (httpd.conf) on lserver6 configured so that external users accessing the system (foo.gotdns.com) can reach the joomla app on lserver6 and likewise the forge app running on lserver9. At the same time, I would also like to be able to access the apps from the internal servers, so I'd need to somehow configure the vhost/ProxyPass setup to handle the internal access as well. I've tried setting up multiple vhosts with no luck. I've looked at the different examples online, so there must be something subtle that I'm missing. The section of my httpd.conf file that deals with the vhosts is below; if there's something else that's needed, let me know and I can post it as well. Thanks, tom.

        ##joomla - file /etc/httpd/conf.d/joomla.conf
        Alias /joomla /var/www/html/joomla
        <Directory /var/www/html/joomla>
        </Directory>

        # Use name-based virtual hosting.
        #NameVirtualHost *:80
        # NOTE: NameVirtualHost cannot be used without a port specifier
        # (e.g. :80) if mod_ssl is being used, due to the nature of the
        # SSL protocol.
        # VirtualHost example:
        # Almost any Apache directive may go into a VirtualHost container.
        # The first VirtualHost section is used for requests without a known
        # server name.
        #<VirtualHost *:80>
        #    ServerAdmin [email protected]
        #    DocumentRoot /www/docs/dummy-host.example.com
        #    ServerName dummy-host.example.com
        #    ErrorLog logs/dummy-host.example.com-error_log
        #    CustomLog logs/dummy-host.example.com-access_log common
        #</VirtualHost>

        NameVirtualHost 192.168.1.56:80

        <VirtualHost 192.168.1.56:80>
            #ServerAdmin [email protected]
            #DocumentRoot /var/www/html
            #ServerName lserver6.tmesa.com
            #ServerName fforge.tmesa.com
            ServerName fforge.gotdns.com:80
            #ErrorLog logs/dummy-host.example.com-error_log
            #CustomLog logs/dummy-host.example.com-access_log common
            #ProxyRequests Off
            ProxyPass / http://192.168.1.81:80/
            ProxyPassReverse / http://192.168.1.81:80/
        </VirtualHost>

        <VirtualHost 192.168.1.56:80>
            #ServerAdmin [email protected]
            DocumentRoot /var/www/html/joomla
            #ServerName lserver6.tmesa.com
            #ServerName fforge.tmesa.com
            ServerName 192.168.1.56:80
            #ErrorLog logs/dummy-host.example.com-error_log
            #CustomLog logs/dummy-host.example.com-access_log common
            #ProxyRequests Off
        </VirtualHost>
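
    One common arrangement, sketched here as a suggestion rather than a definitive config: one name-based vhost per external hostname on lserver6, with the forge hostname reverse-proxied to the machine that runs it (192.168.1.59 per the description above, although the pasted config proxies to 192.168.1.81), and internal clients resolving the same gotdns hostnames to 192.168.1.56 via internal DNS or hosts entries so the same URLs work from inside and outside:

        NameVirtualHost 192.168.1.56:80

        <VirtualHost 192.168.1.56:80>
            ServerName joomla.gotdns.com
            DocumentRoot /var/www/html/joomla
        </VirtualHost>

        <VirtualHost 192.168.1.56:80>
            ServerName forge.gotdns.com
            ProxyPass / http://192.168.1.59:80/
            ProxyPassReverse / http://192.168.1.59:80/
        </VirtualHost>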
