Search Results

Search found 21481 results on 860 pages for 'www yegorov p ru'.


  • CISCO WS-C4948-10GE SFP+?

    - by Brian Lovett
    I have a pair of Cisco WS-C4948-10GE's that we need to connect to a new switch that has SFP+ and QSFP ports. Is there an X2 module that supports this? If so, can someone name the part number that will work? I have found some information, but want to make sure I have the correct part number for our exact switches. Per the discussion in the comments, I believe I have a better understanding of things now. Would I be correct in saying that I need these SR modules on the Cisco side: http://www.ebay.com/itm/Cisco-original-used-X2-10GB-SR-V02-/281228948970?pt=LH_DefaultDomain_0&hash=item417a8d35ea Then, on the switch with SFP+ ports, I can pick up an SR to SFP+ transceiver like this: http://www.advantageoptics.com/SFP-10G-SR_lp.html?gclid=CKP4s-G27b4CFXQiMgodLD8AQA And finally, an SR cable such as this: http://www.colfaxdirect.com/store/pc/viewPrd.asp?idproduct=1551 Am I on the right track here?

    Read the article

  • Is defragging right for me?

    - by blade
    Hi, I am using Hyper-V on my Windows Server 2008 R2 DC x64 machine. I am also using standard SATA drives. I read some threads on here about defragging but could not reach a conclusion about whether or not I should defragment. Can anyone shed some light on whether this would be right for me? Furthermore, what tool is best? There seem to be three:
      http://www.perfectdisk.com/products/business-perfectdisk11-server/key-features
      http://www.diskeeper.com/diskeeper/home/server-edition.aspx?id=40279&wid=7
      http://www.perfectdisk.com/products/business-perfectdisk11-hyper-v/learn-more
    Anyone have experience with this?

    Read the article

  • How to add command line arguments to command line arguments in Windows shortcut?

    - by Pawin
    I know I can add a command line argument/option to a shortcut this way; for example: "C:\Program Files\Internet Explorer\iexplore.exe" www.a.com So IE will connect to a.com when it starts up. What I would like to do is to get IE connecting to a.com when I call it through another program, like the following: C:\Windows\SysWOW64\ForceBindIP.exe 192.168.1.151 "C:\Program Files\Internet Explorer\iexplore.exe" www.a.com This does not work. IE starts up but doesn't go to a.com. It seems like the argument is either ignored or is understood as an argument of ForceBindIP instead (I'm not sure). What I am trying to do is to create two IE shortcuts such that each of them binds one IE window to one NIC and one particular website, so adding www.a.com etc. to its startup list won't help. The OS is Windows 8. Apologies if this has been asked and answered before; please suggest keywords for searching if that's the case.

    Read the article

  • One site being on a subdirectory of another. Does Google count this against you?

    - by Mick
    I have created two similar websites (relating to monetary systems). So far, one appears to be loved by Google and the other hated. I'm struggling to work out why. This is a mystery to me because both sites were created by me with the same design philosophy, both in pure HTML. Both are packed to the rafters with references to, and information about, their respective subjects. One issue I'm worried may be the cause has to do with the location of the sites. I got a web hosting package from hostmonster.com for the successful one, but the less-liked one is just an "add-on" which sits in a subdirectory of the successful one. I wonder if Google somehow detects this and treats it as a less significant website? EDIT: Just to clarify, even though one site is an add-on that sits in a subdirectory of the other, the URL is arranged to look like it is a root. I.e. the unpopular site can be accessed directly with a simple www.myunpopularsite.com name, without specifying any subdirectory. EDIT: Just in case it's important... say the popular site is called pop.com and the unpopular one unpop.com. In the webspace I've purchased, there is a directory called public_html. This is where I put the index.htm and all the other files of my popular site. When I purchased the add-on unpop.com, I made a subdirectory of public_html called unpop. It is within this "public_html\unpop\" that I place the index.htm and all the other files of my unpopular site. Typing www.unpop.com into the address bar of a browser links directly to the contents of "public_html\unpop\" and the user is not aware that this site is sitting on a subdirectory of another site. BUT if you type "www.pop.com/unpop" into the address bar of a browser you DO see the unpopular site.
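
    To restate the layout described above in one place (paths and domain names are the ones from the question):

      public_html/          ->  served as www.pop.com
      public_html/unpop/    ->  served as www.unpop.com (the add-on domain),
                                and also reachable as www.pop.com/unpop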

    Read the article

  • Why is group write permission ignored in Ubuntu?

    - by NorthPole
    I want my user to have full access to the local Apache root folder, and I also want the Apache user to have full access to the same folder. What I did was create a new group called DevGroup and I added my user and www-data there. Also I changed the permissions to 770 to allow full group access. But now it won't allow me or the Apache user any kind of access to the folder. Here is what I get with ls:
      drwxrwx--- 12 root DevGroup 4096 Sep 27 17:34 testFolder
    Which seems perfect, but when I try as a user to access the folder I get this:
      var/www$ ls testFolder/
      ls: cannot open directory testFolder/: Permission denied
    Also, when I try to access a page in the folder from a browser:
      [Thu Sep 27 17:47:16 2012] [error] [client 127.0.0.1] PHP Fatal error: Unknown: Failed opening required '/var/www/testFolder/foo.php' (include_path='.:/usr/share/php:/usr/share/pear') in Unknown on line 0
    What's the problem and how can I fix it?
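
    A minimal sketch of the checks usually suggested for this kind of problem ('myuser' is a placeholder for the actual login name; group membership added after login only takes effect on a new session):

      # Confirm the current session actually carries the group.
      id

      # Add the user and the Apache user to the group if either is missing.
      sudo usermod -aG DevGroup myuser
      sudo usermod -aG DevGroup www-data

      # Directories need the group execute bit to be entered; 770 on
      # testFolder includes it, but the parent directories matter too.
      ls -ld /var/www /var/www/testFolder

      # Pick up the new group in the current shell without logging out.
      newgrp DevGroup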

    Read the article

  • Best solution for getting referral information in PHP

    - by absentx
    I am currently redoing some link structuring on a website. In the past we have used specific PHP files on the last step to direct the user to the proper place. Example: www.mysite.com/action/go-to-blue.php or www.mysite.com/action/short/go-to-red.php or www.mysite.com/action/tall/go-to-red.php We are now restructuring to eliminate the /short/ or /tall/ directory. What this means is that "go-to-blue.php" will now be doing some extra processing to make sure it sends the visitor to the proper place. The static method of the past was quite effective, because, well, if they left from that page we knew we had it right. Now, since we are 301 redirecting action/short/go-to-red.php to just action/go-to-red.php, it is quite important on "go-to-red.php" that we realize a user may have been redirected from /short/ or /tall/. So right now I am using HTTP_REFERER, and of course in my testing that works fine, but after a lot of reading it is clear that this is not a solid solution, so I was starting to brainstorm other ways to check and make sure we get the proper referral information. If we could check HTTP_REFERER plus some other test, I would feel confident we have a pretty good system in place to send the visitor to the right place. Some questions/comments: Could I use a session variable or a cookie to accomplish this goal? If so, would that be maintained through the 301 redirect? I don't see why it wouldn't be. Passing the origin in the URL itself is not an option in this case.
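
    One possible approach, sketched under the assumption that Apache's mod_rewrite issues the 301s: the CO flag can set a cookie during the redirect itself, so no PHP has to run at the old URL and nothing extra appears in the URL. The rule patterns, the cookie name 'origin' and the domain are made up for this sketch.

      # .htaccess sketch: redirect the old /short/ and /tall/ URLs and drop a
      # cookie recording where the visitor came from.
      RewriteEngine On
      RewriteRule ^action/short/(go-to-.+\.php)$ /action/$1 [R=301,L,CO=origin:short:.mysite.com]
      RewriteRule ^action/tall/(go-to-.+\.php)$  /action/$1 [R=301,L,CO=origin:tall:.mysite.com]

      <?php
      // go-to-red.php (sketch): read the cookie set by the redirect and fall
      // back to the referrer header when the cookie is absent.
      $origin = isset($_COOKIE['origin']) ? $_COOKIE['origin'] : null;
      if ($origin === null && isset($_SERVER['HTTP_REFERER'])) {
          $origin = $_SERVER['HTTP_REFERER'];
      }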

    Read the article

  • How to write a blog for SEO purposes

    - by Mathieu Imbert
    I have a photo sharing website which provides very little textual content. Users can add tags and a description to photos, but it creates a lot of duplicate content, because most of the descriptions will be 'wow', 'lol', ... I don't think I should rely on users to build my SEO. I think it would be a great idea to write a blog and use it to describe the best photos, start contests, explain themes; in short: create original content that search engines will love. Our website's main URL is like www.domain.com, and our new blog is hosted on blog.domain.com. From an SEO perspective, is it a good idea to keep the blog separate from the main site? This has the advantage of leaving the original site unchanged, but will it add any PageRank to www.domain.com? If the blog ranks well it will obviously pass some PageRank to the original site through links. What do you think is the best option from an SEO perspective? Include the blog in www.domain.com? Or leave it on blog.domain.com?

    Read the article

  • ArchBeat Link-o-Rama for 2012-04-12

    - by Bob Rhubart
    2012 Real World Performance Tour Dates | Performance Tuning | Performance Engineering (www.ioug.org): Coming to your town: a full day of real world database performance with Tom Kyte, Andrew Holdsworth, and Graham Wood. Rochester, NY - March 8; Los Angeles, CA - April 30; Orange County, CA - May 1; Redwood Shores, CA - May 3.
    Oracle Technology Network Developer Day: MySQL - New York (www.oracle.com): Wednesday, May 02, 2012, 8:00 AM – 4:30 PM, Grand Hyatt New York, 109 East 42nd Street, Grand Central Terminal, New York, NY 10017.
    Webcast Series: Data Warehousing Best Practices (event.on24.com): April 19, 2012 - Best Practices for Workload Management of a Data Warehouse on Oracle Exadata; May 10, 2012 - Best Practices for Extreme Data Warehouse Performance on Oracle Exadata.
    Webcast: Untangle Your Business with Oracle Unified SOA and Data Integration (event.on24.com): Date: Tuesday, April 24, 2012; Time: 10:00 AM PT / 1:00 PM ET; Speakers: Mala Narasimharajan - Senior Product Marketing Manager, Oracle Data Integration, Oracle; Bruce Tierney - Director of Product Marketing, Oracle SOA Suite, Oracle.
    The Increasing Focus on Architecture (ArchBeat) (blogs.oracle.com): As a "third wave" of computing, Cloud computing is changing how IT organizations and individuals within those organizations approach the creation of solutions.
    Updated SOA Documents now available in ITSO Reference Library (blogs.oracle.com): Nine updated documents have just been added to the IT Strategies from Oracle library, including SOA Practitioner Guides, SOA Reference Architectures, and SOA White Papers and Data Sheets. Access to all documents within the ITSO library is free to those with a free Oracle.com membership.
    WebLogic JMS Clustering and Spring | Rene van Wijk (middlewaremagic.com): Oracle ACE Rene van Wijk sets up a WebLogic cluster that includes a JMS environment, which will be used by Spring.
    Running Built-In Test Simulator with SOA Suite Healthcare 11g in PS4 and PS5 | Shub Lahiri (blogs.oracle.com): Shub Lahiri shows how the pre-installed simulator that comes with the SOA Suite for Healthcare Integration pack can be used as an external endpoint to generate inbound and outbound HL7 traffic on specified MLLP ports.
    In the cloud era, let's start calling IT what it is: 'Innovation Team' | Joe McKendrick (www.zdnet.com): Cloud, the third great shift in 50 years of computing, presents a golden opportunity for IT to get out in front and lead.
    Thought for the Day: "Why do we never have time to do it right, but always have time to do it over?" — Anonymous

    Read the article

  • Create FTP accounts with access to just some folders in the web directory

    - by Karevan
    I own a VPS server. At the moment I haven't installed any FTP server on it; I am using SSH and SFTP only. I am using Debian 6 Squeeze and the Apache2 service. The web directory is /var/www/. Well, I wanted to create different FTP accounts and give some people access to them (one account per user). In my web directory I have a structure like this: /var/www/mtaplugins/music/mplayer/music/ /var/www/mapuploader/ and more folders inside. I want to create an FTP account which should be able to access just one of those folders and the folders inside it. I would appreciate some recommendations or steps to follow before installing or doing anything, because I don't have any idea about this. I was thinking of using ProFTPD, but as I saw in the documentation it would just create an account for each user on my server, and I don't want to create more system users (I always use root). Thanks in advance
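
    For what it's worth, ProFTPD can also authenticate "virtual" FTP users that exist only in its own password file rather than as system accounts, each locked into its home directory. A rough proftpd.conf sketch (file locations and the example user are assumptions):

      # Authenticate against a standalone password file instead of /etc/passwd;
      # entries are created with the ftpasswd utility, e.g.
      #   ftpasswd --passwd --file=/etc/proftpd/ftpd.passwd --name=uploader \
      #            --uid=33 --gid=33 --home=/var/www/mapuploader --shell=/bin/false
      AuthUserFile /etc/proftpd/ftpd.passwd
      AuthOrder mod_auth_file.c
      RequireValidShell off

      # Chroot every FTP login into its home directory, so the 'uploader'
      # user above only ever sees /var/www/mapuploader.
      DefaultRoot ~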

    Read the article

  • Why was my site rejected for Google Adsense?

    - by hyuun jjang
    I have a 3-year-old blog and it's got around 16 articles/tutorials about some programming problems and solutions. It's been getting quite a lot of views lately, so I decided to apply for a Google AdSense account. When I first applied via Blogger, Google replied with the following statement: Page Type: In order to participate in Google AdSense, publishers' websites and application information must satisfy the following guidelines: - Your website must be your own top-level domain (www.example.com and not www.example.com/mysite). - You must provide accurate personal information with your application that matches the information on your domain registration. - Your website must contain substantial, original content... So, as I understood it, I decided to buy a domain and point my Blogger blog to that new naked domain. And here is the newly bought domain where all the contents of my old blog reside: http://icodeya.com/ I reapplied, hoping that this time I would make the cut. But then I got this reply: Further detail: Unable to review your site: While reviewing http://www.icodeya.com/, we found that your site was down or unavailable. We suggest you check whether there was a typo in the URL submitted. When your site is operational, you can resubmit your application with the correct site by following the directions below. I'm a bit disappointed. Maybe I did something wrong with the DNS configuration or something, but you can clearly see that my site is fully functional. I heard that Google sends robots to crawl the site, etc. It's just sad, because I invested in a domain name and now I can't even find ways to earn from it. Any tips?

    Read the article

  • Make sure I've configured DNS correctly

    - by user22817
    Hi, on my server with Windows Server 2008, I've added a new zone with my new domain: biografica.ro. I've added a name server for it, ns.biografica.ro, and the IP of the server. I also added two A records for ns.biografica.ro and www.biografica.ro. I've also defined that name server on the site where I bought the domain and pointed it to my IP. Now, I understand that I have to wait 24-48 hours for this new domain to be available online. Question: is there a way to check on the server that I've successfully configured it? For instance, nslookup biografica.ro, nslookup www.biografica.ro and ping www.biografica.ro do not work locally on the server... Can you give me any advice? Thanks.
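
    One quick check (a sketch; run it on the server itself) is to point nslookup at the local DNS service instead of whatever resolver the machine normally uses:

      nslookup www.biografica.ro 127.0.0.1
      nslookup biografica.ro 127.0.0.1

    If those return the A records, the zone itself looks fine and the wait is only for the registrar's delegation to propagate; if they fail even against 127.0.0.1, the zone or the A records need another look.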

    Read the article

  • Linux Security/Sysadmin Courses in London?

    - by mister k
    Hi, My employer has offered to send me on a couple of training courses and I'm just looking for some recommendations. I'm mainly looking to improve my security and general sysadmin skills. I would like to do something focused on UNIX as I mainly work with Linux boxes (but also a couple of FreeBSD boxes). I don't want to do a study-from-home course, so I would need to find somewhere based in London. It would be great to hear from anyone who has some experience with this kind of course. The courses I've found so far are: www.learningtree.co.uk/courses/uk433.htm www.city.ac.uk/cae/cfa/computing/systems_it/linux.html www.city.ac.uk/cae/cfa/computing/systems_it/unix_tools_ss.html I'm not sure the City University courses are advanced enough as I already have experience... Thanks!

    Read the article

  • Not allowed to upload .HTML files to my own DNN Site: Is it normal?

    - by Jake M
    My question: our webhost provider won't make it so we can upload .html files to our DNN site through the DNN File Manager page. Is that normal? Should I push them to allow me to do this? We have recently transferred our website to a DotNetNuke-run website. We originally had our website on a Linux server with Python scripts handling the backend. Obviously we now have a Windows server running .NET with ASP.NET code on the backend. Our webhost is a local Australian company, and they are saying we can't upload any .html files to the main part of the server, i.e. www.ourdomain.com/Portals/0/. They are saying that the only place I can upload .html files is via FTP to this folder: www.ourdomain.com/Portals/0/html_content. This is a major problem for me because I am trying to upload my own skin, which means I need to upload a main.html file to www.ourdomain.com/Portals/0/skins/myskin/, but they won't let me?! I guess what I am asking is, is this normal practice, and why would they not allow this? As an experienced web admin for Linux servers, and as someone who is used to being able to do whatever I want on my OWN server, this is something that really pis$%s me off!

    Read the article

  • Make IP Address point to webroot instead of virtual hosts' documentroot

    - by Reuben L.
    I used to have a one-to-one domain name and IP. Recently I've paid for a second domain name and decided to host it on the same box and IP. As such, I added virtual hosts to point each domain name to a different document root (i.e. /var/www/webbie1 and /var/www/webbie2). The question I have is, can I still make the IP, e.g. http://XXX.XXX.XXX.XXX, point to the webroot, i.e. /var/www/? If so, how do I go about doing it? For a fuller picture, the box is on an Ubuntu server OS and I'm using apache2 as the app server. The changes I made to enable the virtual hosts were in the apache2.conf file with the <VirtualHost [IP address]> ... </VirtualHost> tags. Thanks.
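
    One way this is often handled (a sketch, not tested against this exact setup; the domain names are placeholders): give the bare IP its own vhost with DocumentRoot /var/www, and define it first, since Apache serves the first listed vhost for any request whose Host header matches no ServerName.

      # Default / bare-IP vhost, defined before the domain vhosts.
      <VirtualHost *:80>
          ServerName XXX.XXX.XXX.XXX
          DocumentRoot /var/www
      </VirtualHost>

      <VirtualHost *:80>
          ServerName www.webbie1.example
          DocumentRoot /var/www/webbie1
      </VirtualHost>

      <VirtualHost *:80>
          ServerName www.webbie2.example
          DocumentRoot /var/www/webbie2
      </VirtualHost>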

    Read the article

  • How do I change pages registering as 404 to 200

    - by christian
    I have this problem. After relaunching my site, http://www.kgstiles.com, traffic dropped immensely (about 60%). After troubleshooting for a week and a half - losing thousands of dollars of traffic in the process - I found that Google was getting a 404 error at the end of many of my 301 redirects (so it wouldn't index the new pages). Most of the pages, though, would load fine in my browser. They registered as a 404 error in Google's index as well as in a 404 checker. So my first question is: could this be what's causing my loss of traffic? And second: how do I fix it? I'm desperate! Any help is appreciated!
      # BEGIN s2Member GZIP exclusions
      <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteCond %{QUERY_STRING} (^|\?|&)s2member_file_download\=.+
        RewriteRule .* - [E=no-gzip:1]
      </IfModule>
      # END s2Member GZIP exclusions
      # BEGIN WordPress
      <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
      </IfModule>
      <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteRule ^moreinfo/(.*)$ http://www.kgstiles.com/moreinfo$1 [R=301]
        RewriteRule ^healthsolutions/(.*)$ http://www.kgstiles.com/healthsolutions$1 [R=301]
        RewriteRule ^(.*)\.html$ $1/ [R=301,L]
        RewriteRule ^(.*)\.htm$ $1/ [R=301,L]
      </IfModule>
      # END WordPress

    Read the article

  • Update Google Sitemap for Mobile

    - by dimo414
    I have a series of utilities to generate Google sitemaps for my whole site. These files are massive and slow to build. We want to start telling Google these pages are mobile-crawlable too, by adding them to mobile sitemaps, but the documentation is unclear on whether I need to specify physically different files for my mobile URLs than for my normal ones. If this is my current sitemap:
      <?xml version="1.0" encoding="UTF-8" ?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://mobile.example.com/article100.html</loc>
        </url>
      </urlset>
    Can I simply change it to:
      <?xml version="1.0" encoding="UTF-8" ?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
        <url>
          <loc>http://mobile.example.com/article100.html</loc>
          <mobile:mobile/>
        </url>
      </urlset>
    Or do I need to create new files with the additional markup, alongside my existing files?

    Read the article

  • Virtualhost Wildcard Subdomains

    - by Khuram
    We have one static IP on which we have routed our company website. We have set up a local machine on Windows with WAMP to run our testing server. We want virtual hosts to test our different apps. However, we have a new project which uses wildcard subdomains, and we are not sure how to create wildcard subdomains in the VirtualHost entries. We use:
      NameVirtualHost *
      <VirtualHost *>
          ServerAdmin admin@test
          DocumentRoot "E:/Wamp/www/corporate"
          ServerName companysite.com
      </VirtualHost>
      <VirtualHost *>
          ServerAdmin admin@test
          DocumentRoot "E:/Wamp/www/project"
          ServerName project.companysite.com
      </VirtualHost>
      <VirtualHost *>
          ServerAdmin admin@test
          DocumentRoot "E:/Wamp/www/project"
          ServerName *.project.companysite.com
      </VirtualHost>
    However, the last * wildcard does not work. Any help?
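
    For reference, Apache accepts wildcards in ServerAlias but not in ServerName, so a wildcard vhost usually looks more like the sketch below (untested here). The wildcard host names also have to resolve in DNS; the Windows hosts file cannot express *.project.companysite.com.

      <VirtualHost *>
          ServerAdmin admin@test
          DocumentRoot "E:/Wamp/www/project"
          ServerName project.companysite.com
          ServerAlias *.project.companysite.com
      </VirtualHost>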

    Read the article

  • .html extension or no for SEO purposes

    - by Scott Schluer
    I know this question has been asked before on Stack Overflow, but what I have not been able to find in the posts I've read are concrete references as to WHY one is better than the other (something I can take to my boss). So I'm working on an MVC 3 application that is basically a rewrite of the existing production application (web forms) using MVC. The current site uses a URL rewriter to rewrite "friendly" urls with HTML extensions to their ASPX counterpart. i.e. http://www.site.com/products/18554-widget.html gets rewritten to http://www.site.com/products.aspx?id=18554 We're moving away from this with the MVC site, but the powers that be still want the HTML extension on the URLs. As a developer, that just feels wrong on an MVC site. I've written a quick and dirty HttpModule that will perform a 301 redirect from the .html URL to the same URL without the .html extension and it works fine, but I need to convince management that removing the .html extension is not going to hurt SEO. I'd prefer to have this sort of friendly URL: http://www.site.com/products/18554-widget Can anyone provide information to back up my position or am I actually trying to do something that WOULD hurt SEO, in which case can you provide references on that?

    Read the article

  • Running CRON job on Ubuntu server for SugarCRM

    - by Logik
    I am pretty inexperienced in Linux, so please be descriptive in your answer. My environment: local Linux server running Ubuntu 12.04, hosting SugarCRM 6.5.2. There is an area in SugarCRM called the scheduler, where I can configure some predefined jobs; in my case I am trying to run email reminders (every min/hour/day/month). For this scheduler to be effective, I read somewhere that I need to set up a CRON job. So I did some research and finally put the following line in the crontab for the root user, as per the instructions given by SugarCRM: cd /var/www/crm; php -f cron.php > /dev/null 2>&1 Well, I am creating contracts in my SugarCRM (AOS module) and I want email reminders for these contracts to be sent to the person concerned. Now my SugarCRM email is configured correctly and I can send test emails using it, but CRON + scheduler is not giving any result; I don't receive any emails. Then I tried to read /var/log/syslog and it shows an entry for the following line each minute: Oct 27 15:03:01 unicomm CRON[28182]: (root) CMD (cd /var/www/crm; php -f cron.php > /dev/null 2>&1) I've a few questions: 1) What does the CRON job line I've added in the crontab mean? cd /var/www/crm; php -f cron.php > /dev/null 2>&1 is not making any sense to me. 2) How am I supposed to get this thing to work? I've searched a lot (including the SugarCRM forum), but no luck.
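
    For the first question, the line breaks down as follows (this is the usual form of the SugarCRM cron entry, with the output redirection written out):

      # cd /var/www/crm; php -f cron.php > /dev/null 2>&1
      #
      #   cd /var/www/crm    - change into the SugarCRM install directory first
      #   php -f cron.php    - run SugarCRM's scheduler entry point with the CLI PHP binary
      #   > /dev/null 2>&1   - discard stdout and stderr, so cron does not mail the output to root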

    Read the article

  • Plesk and Apache configuration gives me 403 on all sites

    - by Michael Stark
    My friend's server is running Plesk 9.2 with Apache. There were some problems in the last few days, and he can't tell me exactly what happened. The situation now is the following: he has a lot of domains on it, and when somebody visits any domain it shows a 403. I checked the logs and saw the problem: [Sun Jun 24 08:24:47 2012] [error] [client XX.XX.XX.XX] script '/srv/www/htdocs/index.php' not found or unable to stat Apache should route to '/srv/www/vhosts/domain.tld/htdocs/index.php' instead of /srv/www/htdocs/index.php, and it's doing this on all of the domains. Can you tell me what's wrong and how to fix it?

    Read the article

  • Need help with an .htaccess URL rewriter

    - by AlexV
    I'm trying to do another SEO system with PHP/.htaccess... I need the following rules to apply: Must catch all URLs that do not end with an extension (www.foo.com -- catch | www.foo.com/catch-me -- catch | www.foo.com/dont-catch.me -- don't catch). Must catch all URLs that end with .php* (.php, .php4...) (these are the exceptions to rule #1). All rules must only apply in some directories and not in their subdirectories (/ and /framework so far). The .htaccess must send the typed URL in a GET value so I can work with it in PHP. Can any mod_rewrite wizard help me?
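
    A rough mod_rewrite sketch of those rules, placed in the .htaccess of / and of /framework (the handler name 'handler.php' and the 'url' parameter are made up for this sketch):

      RewriteEngine On

      # Never rewrite the handler itself (avoids a rewrite loop).
      RewriteCond %{REQUEST_URI} !handler\.php$
      # Catch URLs with no extension at all, OR with a .php/.php4/... extension.
      RewriteCond %{REQUEST_URI} !\.[A-Za-z0-9]+$ [OR]
      RewriteCond %{REQUEST_URI} \.php[0-9]*$
      # Only paths at this directory level match (no '/' in the pattern), so
      # subdirectories are untouched; the matched name is passed as a GET value.
      RewriteRule ^([^/]*)$ handler.php?url=$1 [QSA,L]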

    Read the article

  • Subdomain still times out after set up a month ago

    - by user8137
    I'm a newbie at this and this has probably been asked already, but the threads I found online were close but too vague in their answers, so I've probably really messed this up. I would really appreciate specific step-by-step instructions. This is what I'd like to do: use the subdomain www.high-res.domain.com to be accessed by external customers with specific permissions to access the site (like FTP). We use Network Solutions to host domain.com. We recently added a new IP address to point to www.high-res.domain.com. I gave the IP address to the company that hosts our website. I pinged www.high-res.domain.com and it points to the correct IP address but still times out. It's been a few weeks now and when you ping it, it still times out: C:\>ping XXX.XXX.X.XXX Pinging XXX.XXX.X.XXX with 32 bytes of data: Request timed out. Request timed out. Request timed out. Request timed out. Ping statistics for XXX.XXX.X.XXX: Packets: Sent = 4, Received = 0, Lost = 4 (100% loss). Tracert times out as well. I even went to DNS tools and a few other sites to check this and they show the same thing. I recently went into DNS management (DNSmgmt) on our server (Win2k3 SP1) and created an A record under DomainDnsZones, which translated to a CNAME when you look at it. Under the domain it has two entries, one for the subdomain and the other for the website host, each with separate IP addresses. Is this correct? The website people are too busy on another project to research it further and my friends haven't gotten back to me. Please help. Thanks KK

    Read the article

  • Webserver insists on opening "blog1.php" instead of "index.php"

    - by pepoluan
    I'm at my wits' end. I have just ripped out a website and am in the process of rebuilding everything. Previously, the 'home page' of the website was a blog, with the address "www.mydomain.com/blog1.php". After exporting everything, I deleted the whole directory and - based on request - immediately created a blog/ directory. The idea is to get the blog back up as soon as possible, and temporarily redirect people accessing www.mydomain.com to the blog. Accessing the blog via http://www.mydomain.com/blog/ works. So I put in an index.php file containing a (temporary) redirect to the blog's address. The problem: the server insists on opening blog1.php instead of index.php, even after we deleted all the files (including .htaccess), and even putting in a new .htaccess file with the single line DirectoryIndex index.php doesn't work. The server stubbornly wants blog1.php. Now, the server is actually a webhosting account, so I have no actual access to it; I have to do my work via cPanel. Currently, I work around this issue by creating blog1.php, but I really want to know why the server does not revert to opening index.php. Did I perhaps miss some important setting in the byzantine cPanel menu pages?

    Read the article

  • How can I cache one more web site on the same backend server (web server) with Varnish?

    - by Kerberos
    I have one web server, running IIS, which sits behind Varnish. There are several web sites on IIS; all the sites' host headers are configured on IIS and all sites are published on port 80. Can I cache all the web sites with Varnish using code like the following?
      backend CacheWebSites {
          .host = "192.168.0.1";
          .port = "80";
      }

      sub vcl_recv {
          if (req.http.host == "www.example1.com") {
              set req.backend = CacheWebSites;
          }
          if (req.http.host == "www.example2.com") {
              set req.backend = CacheWebSites;
          }
          if (req.http.host == "www.example3.com") {
              set req.backend = CacheWebSites;
          }
      }
    I can't test this code; it is just a scenario. Thank you in advance for your help.

    Read the article

  • Apache2, can't apply Directory access

    - by skomak
    Hi, I can't figure out how to deny access to a directory. Here is my config:
      <VirtualHost x.x.x.x:80>
          DocumentRoot /var/www/html/wwwhtml
          ServerName mydomain.com
          ServerAdmin [email protected]
          ErrorLog /var/log/httpd/mydomain_error.log
          TransferLog /var/log/httpd/mydomain_access_log
          Alias /test /var/www/html/wwwhtml/eventum
          <Directory /var/www/html/wwwhtml/eventum>
              Order deny,allow
              Deny from all
              #Allow from 192.168.0
          </Directory>
    I deny access to /test but it doesn't work; on another server of mine it works perfectly :/ Do you know what could cause that problem and how to solve it? This is not the whole config, just the most important part. Maybe the rewrites could cause it? Thanks in advance.

    Read the article
