Search Results

Search found 33029 results on 1322 pages for 'access vba'.

Page 522/1322

  • I need a reverse proxy solution for SSH

    - by Bond
    Here is the situation: I have a server in a corporate data center for a project, and I have SSH access to this machine on port 22. Several virtual machines run on this server, and behind everything a number of other operating systems are running. Since I am behind the data center's firewall, my supervisor asked me whether I can give many people on the Internet direct access to these virtual machines. If I were allowed to receive traffic on a port other than 22 I could set up port forwarding, but since I am not, what can the solution be in this case? The people who want to connect may not be technical at all; they may be happy just opening PuTTY on their machines, or perhaps FileZilla. I have configured an Apache reverse proxy to redirect Internet traffic to the virtual machines on these hosts, but I am not clear on what I can do for SSH. Is there something equivalent to an Apache reverse proxy that can do similar work for SSH in this situation? I do not control the firewall, no port other than 22 is open, and even if I request it they won't open another. SSH-ing twice (two hops) is not something my supervisor wants.
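
    A hedged aside (my illustration, not from the article): one common pattern that fits these constraints is to use the exposed server as an SSH jump host, so outside users reach an internal VM through the single open port 22. Host names and the VM address below are placeholders.

      # Hypothetical example: reach an internal VM through the gateway's port 22
      ssh -J user@gateway.example.com admin@10.0.0.5

      # Equivalent for OpenSSH clients older than 7.3 (no -J option)
      ssh -o ProxyCommand="ssh -W %h:%p user@gateway.example.com" admin@10.0.0.5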

    Read the article

  • 403 forbidden for cgi-bin/ and cannot protect site with password

    - by gasgdasdgasdg
    The first problem I have is a 403 Forbidden error for cgi-bin/. I have created a new document root at /var/www2/ and I can access it fine; PHP runs fine. The second problem is that I cannot password-protect it. I first tried htpasswd: it asks for a login, but every time I log in it keeps asking again. It is getting frustrating; I have tried every trick I know and nothing seems to work. This is the virtual host config inside sites-available (httpd.conf is empty, but I have apache2.conf). Code: NameVirtualHost 12.12.12.12. <VirtualHost 12.12.12.12> ServerAdmin webmaster@localhost DocumentRoot /var/www2/ <Directory /> Options FollowSymLinks AllowOverride None </Directory> <Directory /var/www2/> Options Indexes FollowSymLinks MultiViews AllowOverride None Order allow,deny allow from all </Directory> ScriptAlias /cgi-bin/ /var/www2/cgi-bin/ <Directory "/var/www2/cgi-bin/"> AllowOverride Options Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch AddHandler cgi-script cgi pl Order allow,deny Allow from all </Directory> ErrorLog /var/log/apache2/error.log # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. LogLevel warn CustomLog /var/log/apache2/access.log combined ServerSignature On Alias /doc/ "/usr/share/doc/" <Directory "/usr/share/doc/"> Options Indexes MultiViews FollowSymLinks AllowOverride None Order deny,allow Deny from all Allow from 127.0.0.0/255.0.0.0 ::1/128 </Directory> </VirtualHost>
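
    For reference, a hedged sketch of the usual Basic-auth setup (not taken from the article; the file path and user name are placeholders). Note that with AllowOverride None any .htaccess file is ignored, so the auth directives have to live in the vhost itself:

      # create the password file once (-c), add further users without -c
      htpasswd -c /etc/apache2/.htpasswd alice

      # inside the <Directory /var/www2/> block of the vhost:
      #   AuthType Basic
      #   AuthName "Restricted"
      #   AuthUserFile /etc/apache2/.htpasswd
      #   Require valid-user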

    Read the article

  • Windows Explorer and UAC: run elevated

    - by syneticon-dj
    I am profoundly annoyed by UAC and switch it off for my admin user wherever I can. Yet there are situations where I can't, especially on machines not under my continuous administration. In those cases I am always faced with the task of traversing directories with my administrative user via Windows Explorer where regular users do not have "read" permissions. The two possible approaches to this problem so far: change the ACLs on the directory in question to include my user (Windows conveniently offers the Continue button in the "You don't currently have permissions to access this folder" dialog); this obviously sucks, since more often than not I do not want to change ACLs but just look at the folder's contents. Or use an elevated cmd.exe prompt along with a bunch of command-line utilities; this usually takes a lot of time when browsing through large and/or complex directory structures. What I would love to see is a way to run Windows Explorer in elevated mode, but I have yet to find out how to do so. Other suggestions that solve this problem unobtrusively, without changing the entire system's configuration (and preferably without downloading or installing anything), are very welcome, too. I have seen this post with a suggestion for altering HKCR; interesting, but it changes the behavior for all users, which I am not allowed to do in most situations. Also, some folks have suggested using UNC paths to access the folders; unfortunately this does not work when accessing the same machine (i.e. \\localhost\c$\path), as the "Administrators" group membership is still stripped from the token, and re-authentication (and thus the creation of a new token) does not happen when accessing localhost.

    Read the article

  • Virtualbox Networking: XP Guest, Ubuntu Host: Connecting to Windows servers & local network?

    - by user51833
    Here's what I have: Windows XP running in VirtualBox 3.0.8_OSE r53138; host OS = Ubuntu 9.10 "Karmic Koala"; a Windows network in my office with SMB file servers; the guest OS is connected to the internet and is sharing folders with the host OS; limited networking expertise. Here's what I actually need to do: use MS Outlook in my XP guest with all its calendar-sharing features and such (if this is all done through the internet then great), or find a Linux app that can do the same things; map Windows network servers, e.g. smb://server01/, in my XP guest (I can already access these in Ubuntu). Here's what I've tried with no luck: entering the server address (example above) in my XP guest's Windows Explorer address bar (I got a "could not access the file, path or drive" error message; maybe it would work if I could enter login/password information, but I don't know how); mapping the server as a network drive (Windows could not find the path); mounting the server as one of my shared folders (I couldn't find it through the shared folders browser in VirtualBox; is there somewhere in the Linux filesystem where Ubuntu keeps links to mounted servers?).
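
    A hedged aside (my note, not from the article): Windows does not understand smb:// URLs, so inside the XP guest the share is normally addressed by its UNC path, and credentials can be supplied when mapping it. A minimal sketch with placeholder server, share, and account names:

      rem map the office file server to a drive letter, prompting for the password
      net use Z: \\server01\share /user:OFFICEDOMAIN\username *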

    Read the article

  • What can I do with a home server?

    - by Joel Coehoorn
    I have an old 700 MHz Pentium III at home running Windows 2000 Server, with a home router set up to pass incoming requests to it and a DynDNS account set up so it's easy to find. Right now I'm using it for a number of things: shared folders + backup inside the home network; a shared printer inside the home network; a domain controller, just because I feel like it and because it's useful practice to keep those "enterprise" administration skills; a web server; FTP remote access for my files (I abandoned this for security reasons, but it's still worth leaving visible); Remote Desktop into the home network (thinking about adding VPN service); an SVN repository; MySQL (will be moving to SQL Server 2008 Standard soon). After I upgrade my wife's laptop from Home to Pro later this year it will also become a domain controller. It's also the only place I still have access to Internet Explorer 6 without setting up a new virtual machine, so I use it for testing code with that browser. The question is: what else could I be doing with this machine? Update: additional ideas based on the suggestions: media server/DVR; build server; PBX; SSH proxy server; continuous integration server; personal OpenID provider. Update 2: just a note that this server was recently upgraded to an Atom 330 with 2 GB RAM and a bigger hard drive. For all that it's slow for a "modern" CPU, it should still be much faster than the old Pentium III, and the expected power savings should make the upgrade essentially free over the course of the next year or two. Also, it's now running Windows Server 2008.

    Read the article

  • On Linux, what does it mean when a directory has size 0 instead of 4096?

    - by kdt
    Here's a strange thing I haven't seen before -- a directory whose size is reported by ls as 0 instead of 4096, and I can't create any files within it. # ls -ld lib home drwxr-xr-x. 2 root root 0 Feb 7 03:10 home <-- it has zero size dr-xr-xr-x. 11 root root 4096 Feb 4 09:28 lib # touch home/foo touch: cannot touch `home/foo': No such file or directory <-- and I can't create files in it # rm home rm: cannot remove `home': Is a directory <-- look, it really is a dir So what does it mean for a directory to have size 0 instead of 4096? Filesystem is ext4 on fedora core 14. The output of mount is: /dev/mapper/vg_dev-lv_root on / type ext4 (rw) proc on /proc type proc (rw) sysfs on /sys type sysfs (rw) devpts on /dev/pts type devpts (rw,gid=5,mode=620) tmpfs on /dev/shm type tmpfs (rw,rootcontext="system_u:object_r:tmpfs_t:s0") /dev/vda1 on /boot type ext4 (rw) none on /proc/sys/fs/binfmt_misc type binfmt_misc (rw) sunrpc on /var/lib/nfs/rpc_pipefs type rpc_pipefs (rw) Output of du -s /home: 0 /home Output of stat /home: File: `/home' Size: 0 Blocks: 0 IO Block: 1024 directory Device: 15h/21d Inode: 34913 Links: 2 Access: (0755/drwxr-xr-x) Uid: ( 0/ root) Gid: ( 0/ root) Access: 2011-02-07 03:45:46.188995765 -0800 Modify: 2011-02-07 03:11:59.980995019 -0800 Change: 2011-02-06 07:58:45.874995002 -0800
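
    A hedged diagnostic note (my suggestion, not from the article): a zero-size directory that refuses file creation is often not a plain ext4 directory at all but a mountpoint managed by autofs or another pseudo-filesystem, which would also fit the unusual device number and 1024-byte IO block in the stat output. A quick way to check:

      # see what is actually mounted at /home and whether autofs is managing it
      grep home /proc/mounts
      mount | grep -i autofs
      service autofs status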

    Read the article

  • 403 Error when accessing vhost directive

    - by Ortix92
    I'm having some trouble setting up my web server (CentOS 5.8). It's a brand new server and I'm trying to point a vhost at the following directory: /home/exo/public_html. However, whenever I restart httpd I get the following warning: Code: Starting httpd: Warning: DocumentRoot [/home/exo/public_html] does not exist. Yes, the directory does exist. And whenever I visit the domain exo-l.com it gives me a 403 error. This is my config (I put it inside httpd.conf because the files in conf.d were not being included for some reason, or at least not my newly created vhost conf file, but that has zero priority for now): <VirtualHost *:80> DocumentRoot /home/exo/public_html ServerName www.exo-l.com ServerAlias exo-l.com <Directory /home/exo/public_html> Order allow,deny Allow from all </Directory> </VirtualHost> I'm completely clueless because this should work as far as I know. httpd is being run as apache:apache. I tried chowning the public_html directory (also recursively) to exo:apache, apache:apache, and root:root with no success; chmod 777 doesn't do anything either. A tail from the log: [Sat Oct 13 15:10:04 2012] [error] [client 82.***.***.61] (13)Permission denied: access to / denied [Sat Oct 13 15:10:04 2012] [error] [client 82.***.***.61] (13)Permission denied: access to / denied I also found something about SELinux and that disabling it might help, but do I really want to do that?
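
    A hedged sketch of the usual checks for this symptom (my suggestion, not from the article): Apache needs execute (traverse) permission on every parent directory of the DocumentRoot, and on CentOS an enforcing SELinux will also deny content under /home unless it is labeled for httpd; relabeling is usually preferable to switching SELinux off:

      # show the permissions of every path component Apache must traverse
      namei -m /home/exo/public_html
      chmod o+x /home /home/exo

      # label the tree for httpd instead of disabling SELinux
      chcon -R -t httpd_sys_content_t /home/exo/public_html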

    Read the article

  • nginx redirect what is not coming from load balancing

    - by dawez
    I have nginx on SERVER1 acting as a load balancer between SERVER1 and SERVER2. On SERVER1 the upstreams for the load balancing are defined as: upstream de.server.com { # similar upstreams defined also for other languages # SELF SERVER1 server 127.0.0.1:8082 weight=3 max_fails=3 fail_timeout=2; # other SERVER2 server otherserverip:8082 max_fails=3 fail_timeout=2; } The load-balancing config on SERVER1 is this one: server { listen 80; server_name ~^(?<LANG>de|es|fr)\.server\.com; location / { proxy_pass http://$LANG.server.com; proxy_set_header Host $host; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; # trying to pass a variable in the header to SERVER2 proxy_set_header Is-From-Load-Balancer 1; } } Then on SERVER2 I have: server { listen 8082; server_name localhost; root /var/www/server.com/public; # test output values add_header testloadbalancer $http_is_from_load_balancer; add_header testloadbalancer2 not_load_bal; ## other stuff here to process the request } I can see that the "testloadbalancer" response header is set to 1 when the request comes through the load balancer, and it is not present on direct access to SERVER2:8082. I would like to bounce back to SERVER1 all the direct requests sent to SERVER2, but keep the ones coming from the load balancer. So this should forbid direct access to SERVER2:8082 and redirect it to SERVER1:80.
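
    A hedged sketch of one way to do this on SERVER2, building on the header the question already sets (nginx exposes the Is-From-Load-Balancer request header as $http_is_from_load_balancer); this is an illustration, not the article's answer, and since any client can forge the header, a firewall rule limiting port 8082 to SERVER1's address would be the more robust variant:

      server {
          listen 8082;
          # anything that did not come through the balancer gets bounced back
          if ($http_is_from_load_balancer != 1) {
              return 301 http://de.server.com$request_uri;
          }
          # ... normal processing for balancer traffic ...
      }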

    Read the article

  • Linux security: The dangers of executing malicious code as a standard user

    - by AndreasT
    Slipping some (non-root) user a piece of malicious code that he or she then executes might be considered one of the worst security breaches possible (the only worse one I can see is actually compromising the root user). What can an attacker effectively do when he or she gets a standard user (let's say a normal Ubuntu user) to execute code? Where would an attacker go from there? What would that piece of code do? Let's say that the user is not stupid enough to be lured into entering the root/sudo password into a form or program she doesn't know, and only software from trusted sources is installed. The way I see it there is not really much one could do, is there? Addition: I partially ask this because I am thinking of granting some people shell (non-root) access to my server. They should be able to have normal access to programs, and I want them to be able to compile programs with gcc, so there will definitely be arbitrary code running in user space...

    Read the article

  • Cannot open ports in iptables on CentOS 5?

    - by abszero
    I am trying to open up ports in CentOS's firewall and am having a terrible go at it. I have followed the "HowTo" here: http://wiki.centos.org/HowTos/Network/IPTables as well as a few other places on the Net but I still can't get the bloody thing to work. Basically I wanted to get two things working: VNC and Apache over the internal network. The problem is that the firewall is blocking all attempts to connect to these services. Now if I issue service iptables stop and then try to access the server via VNC or hit the webserver everything works as expected. However the moment I turn iptables back on all of my access is blocked. Below is a truncated version of my iptables file as it appears in vi -A RH-Firewall-1-INPUT -p tcp -m state --state NEW -m tcp --dport 5801 -j ACCEPT -A RH-Firewall-1-INPUT -p tcp -m state --state NEW -m tcp --dport 5901 -j ACCEPT -A RH-Firewall-1-INPUT -p tcp -m state --state NEW -m tcp --dport 6001 -j ACCEPT -A RH-Firewall-1-INPUT -p tcp -m state --state NEW -m tcp --dport 5900 -j ACCEPT -A RH-Firewall-1-INPUT -p tcp -m state --state NEW -m tcp --dport 80 -j ACCEPT Really I would just be happy if I could get port 80 opened up for Apache since I can do most stuff via putty but if I could figure out VNC as well that would be cool. As far as VNC goes there is just a single/user desktop that I am trying to connect to via: [ipaddress]:1 Any help would be greatly appreciated!
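
    A hedged note (my observation, not from the article): on CentOS 5 the RH-Firewall-1-INPUT chain normally ends with a catch-all REJECT rule, and ACCEPT lines appended after that rule never match, which produces exactly this symptom. A sketch of inserting the rule above the REJECT and persisting it (the insert position 4 is an assumption about the chain layout):

      # insert the rule near the top of the chain instead of appending it at the end
      iptables -I RH-Firewall-1-INPUT 4 -p tcp -m state --state NEW -m tcp --dport 80 -j ACCEPT
      # persist across restarts and verify the ordering
      service iptables save
      iptables -L RH-Firewall-1-INPUT -n --line-numbers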

    Read the article

  • How do I change file protections running XP on a disk from Windows Server?

    - by cdkMoose
    I had a Windows Server 2003 machine running at home, along with my desktop which I use for development. Server went belly up, but since my desktop is reasonably powerful, I figured I would move the disk from the file server (it was OK) into my XP machine to keep all of the files. Disk comes up fine and shows all of the files. I have been getting access denied errors when trying to work with some of the files. When I display attributes in Explorer, none of them are marked Read-Only. When I view properties on the directories, the Read-Only checkbox is not checked, but has a green background(which I thought meant mixed usage for files in the directory). When I click on the checkbox to clear it and click Apply, the disk does some work and all looks well. However, I continue to get the Access Denied errors, the files still don't show any Read-Only attribute and the directory properties shows the green background again on the Read-Only checkbox. I did check the box which says to apply the change to the folder and all files /subfilders under it. I am assuming that the issue relates to userids/permissions carried over from the Server install. So, why does it let me think I can change the attribute when I can't and how can I correct this problem so that the disk correctly recognizes the ids from XP?

    Read the article

  • How can I create a VLAN on my Extreme switch for a separate subnet/domain?

    - by drpcken
    I'm putting together a small Active Directory implementation for a buddy of mine. I currently have 2 servers (one is the primary domain controller) and a couple of clients. I need to test and run updates on every machine in this domain, but I would have to plug them into my current LIVE domain to get them internet access. From what I've read, having two separate domains on a single subnet is a bad idea (even though it is temporary), so I don't want to risk messing anything up on my production domain. I'm pretty sure I can create a separate VLAN on my Extreme 48-port switch and plug this smaller domain into it on a different subnet, but I don't know the commands. Both subnets would need internet access, of course (one of the things I can't wrap my head around is routing internet traffic between subnets, since the gateway is on the production subnet). My production domain is on subnet 192.168.200.0; the new domain I want to put online would go into subnet 192.168.10.0. A shove in the right direction would be greatly appreciated. Thank you!
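
    A hedged sketch of the switch-side commands, assuming an ExtremeXOS-based switch (exact syntax differs between ExtremeWare and XOS, and the VLAN name, tag, and port list are placeholders); for internet access the new subnet also needs ipforwarding on the switch plus a return route or NAT on the production gateway:

      create vlan lab
      configure vlan lab tag 10
      configure vlan lab add ports 45-48 untagged
      configure vlan lab ipaddress 192.168.10.1/24
      enable ipforwarding vlan lab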

    Read the article

  • nginx proxying different servers for different subdomains

    - by The.Anti.9
    I just set up an nginx server. On the same computer as nginx, I have Apache running on port 8000 (this was previously set up), and I want the bare domain and the www. subdomain to go to the local Apache instance. But I want the stuff. subdomain to point to the server where I keep all my miscellaneous files (pictures, documents, etc.), which is also listening on port 80 at the IP 192.168.1.102. I tried configuring it, but when I go to my domain, I just get the "Welcome to nginx!" page. Here's what I have: user www-data; worker_processes 1; error_log /var/log/nginx/error.log; pid /var/run/nginx.pid; events { worker_connections 1024; } http { include /etc/nginx/mime.types; default_type application/octet-stream; sendfile on; #tcp_nopush on; #keepalive_timeout 0; keepalive_timeout 65; tcp_nodelay on; gzip on; include /etc/nginx/conf.d/*.conf; server { listen 80; server_name theanti9.com www.theanti9.com; access_log /var/log/nginx/access.log; location / { proxy_pass http://localhost:8000; } } server { listen 80; server_name stuff.theanti9.com; access_log /var/log/nginx/access.log; location / { proxy_pass http://192.168.1.102:80; } } } I'm not really sure what's wrong. Any suggestions?
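
    A hedged note (an assumption on my part, not from the article): the stock "Welcome to nginx!" page usually means a packaged default server block, for example in /etc/nginx/conf.d/default.conf or sites-enabled/default, is still answering as the default vhost, or the requested Host name does not match either server_name. Some quick checks:

      # syntax-check the config and see every file that defines a server_name
      nginx -t
      grep -r server_name /etc/nginx/nginx.conf /etc/nginx/conf.d/
      # after removing or adjusting the default server block
      nginx -s reload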

    Read the article

  • Sharing between Vista and Windows 7

    - by Metro Smurf
    Vista Ultimate 32-bit; Windows 7 Ultimate 64-bit. I've read through similar questions about sharing between Win7 and Vista, but none of them have resolved my issue of not being able to share between the two: "Connecting to a Vista shared folder from Windows 7", "Networking Windows 7 and Vista", "Enable File sharing in Windows Vista". Previously I had my Vista and XP systems sharing back and forth without any problems; I was able to access the shares without entering a user name / password in the NT challenge prompt (note: account names and passwords were different on the Vista and XP systems). I have now replaced my XP system with a Win7 system, and when I attempt to access shares to/from Vista / Win7, I am continually prompted with an NT challenge to enter my credentials. Things I've verified/tried: both systems are on the same workgroup; Win7 is using the Home network and Vista the Private network, so neither system is using a Public network profile; enabled file sharing with and without password protection on both Vista and Win7; tried both HomeGroup Connections settings on Win7 (allow Windows to manage connections, and use user accounts and passwords to connect); reviewed too many online articles to count while troubleshooting; set the shares to give full control to Everyone; set up the shares both directly on the directory and through the share manager. My question: how can I enable file sharing between Vista and Win7 without ever being prompted with a username/password challenge?
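
    A hedged possibility (my suggestion, not from the article): because the account names and passwords differ between the machines, storing matching credentials for the remote machine in Credential Manager lets the connection authenticate silently instead of prompting. A sketch with placeholder names, run on the connecting machine:

      rem store credentials for the Vista box, then connect to its share
      cmdkey /add:VISTAPC /user:VISTAPC\metro /pass:TheVistaPassword
      net use \\VISTAPC\SharedFolder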

    Read the article

  • Nginx rewrite is not working as expected

    - by SamFisher83
    I am trying to learn how to use nginx and its rewrite functionality. Nginx seems to be doing the rewrite: 2012/03/27 16:30:26 [notice] 16216#0: *3 "foo.php" matches "/foo.php", client: 61.90.22.223, server: localhost, request: "GET /foo.php HTTP/1.1", host: "domain.com" 2012/03/27 16:30:26 [notice] 16216#0: *3 rewritten data: "img.php", args: "", client: 61.90.22.223, server: localhost, request: "GET /foo.php HTTP/1.1", host: "domain.com" But in my access log I am getting the following: 61.90.22.223 - - [27/Mar/2012:16:26:54 +0000] "GET /foo.php HTTP/1.1" 404 31 "-" "Mozilla/5.0 (Windows NT 6.1; rv:11.0) Gecko/20100101 Firefox/11.0" 61.90.22.223 - - [27/Mar/2012:16:30:26 +0000] "GET /foo.php HTTP/1.1" 404 31 "-" "Mozilla/5.0 (Windows NT 6.1; rv:11.0) Gecko/20100101 Firefox/11.0" There is an img.php in the root directory, so I am not sure why I am getting a 404 error. Here is part of the configuration block: rewrite foo.php img.php last; location / { try_files $uri $uri/ /index.html; } location ~ \.php$ { fastcgi_pass 127.0.0.1:9000; fastcgi_index index.php; include fastcgi_params; } # deny access to .htaccess files, if Apache's document root # concurs with nginx's one # location ~ /\.ht { deny all; }
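
    Two hedged observations (assumptions on my part, not the article's answer): the rewrite target has no leading slash, so the rewritten URI is "img.php" rather than "/img.php", and on many distributions the stock fastcgi_params file does not set SCRIPT_FILENAME, so PHP-FPM answers with a short "No input file specified" 404. A sketch of both fixes:

      # anchor the rewrite and use an absolute URI as the target
      rewrite ^/foo\.php$ /img.php last;

      location ~ \.php$ {
          fastcgi_pass 127.0.0.1:9000;
          fastcgi_index index.php;
          include fastcgi_params;
          # hand PHP-FPM the full on-disk path of the script
          fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
      }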

    Read the article

  • How to setup Joomla CMS as a backend for iPhone app

    - by srik
    I would like my iPhone app to get dynamic content off the net. This content should be managed using a CMS. I have gone ahead and installed Joomla on my server and will be using the Joomla web interface to create and manage content. I would now like the iPhone app to login to my server and fetch the content. I do not want the complete web pages for my iPhone app. Instead, I want the content in the form of XML or JSON or some serialized format so that I can use the data in a custom layout native to the app. So I am looking for 2 things in particular: 1. How to setup HTTP based authentication for my iPhone app to access data from my server. 2. How to access the content in a serialized format (XML, JSON etc) Are there plugins/extensions/components I can use to achieve the same. Any advice on how this can be achieved would be helpful. I am completely new to setting up/using CMS.

    Read the article

  • Extending an ext4 partition on Debian 7.0 on vSphere

    - by VoidPointer
    I grew the thin-provisioned disk to 15 GB when I found 8 GB insufficient. Now the Debian guest does not recognize the change in size. root@debian7-x64:~# lvdisplay --- Logical volume --- LV Path /dev/debian7-x64/root LV Name root VG Name debian7-x64 LV UUID EU6mg0-XTXC-ci3D-bQJi-7XN6-r8Hp-SYxcj0 LV Write Access read/write LV Creation host, time debian7-x64, 2013-06-25 12:02:49 +0530 LV Status available # open 1 LV Size 7.39 GiB Current LE 1892 Segments 1 Allocation inherit Read ahead sectors auto - currently set to 256 Block device 254:0 --- Logical volume --- LV Path /dev/debian7-x64/swap_1 LV Name swap_1 VG Name debian7-x64 LV UUID xDNtoz-tJUq-M5D6-GGCN-gzcD-fwUv-fYYDR1 LV Write Access read/write LV Creation host, time debian7-x64, 2013-06-25 12:02:49 +0530 LV Status available # open 2 LV Size 376.00 MiB Current LE 94 Segments 1 Allocation inherit Read ahead sectors auto - currently set to 256 Block device 254:1 root@debian7-x64:~# pvdisplay --- Physical volume --- PV Name /dev/sda5 VG Name debian7-x64 PV Size 7.76 GiB / not usable 2.00 MiB Allocatable yes (but full) PE Size 4.00 MiB Total PE 1986 Free PE 0 Allocated PE 1986 PV UUID SehkzH-Gq8Y-jI2f-27Tb-uv1Z-tR1R-5OnTxR root@debian7-x64:~# sfdisk -s /dev/sda: 15728640 /dev/mapper/debian7--x64-root: 7749632 /dev/mapper/debian7--x64-swap_1: 385024 total: 23863296 blocks Please help me extend this partition. Rebooting is no problem, and I don't have a live CD. Environment: Debian 7 with LVM on vSphere, ext4 partition. I can provide more details if needed.
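
    A hedged outline of the usual sequence (my sketch, not the article's answer; device names are taken from the output above, but the assumption that /dev/sda5 is the LVM partition inside an extended partition needs checking, and growing the partition itself is done with fdisk or parted):

      # 1. make the kernel notice the larger virtual disk (or simply reboot)
      echo 1 > /sys/class/block/sda/device/rescan
      # 2. grow the partition holding the PV with fdisk/parted, reboot, then:
      pvresize /dev/sda5
      lvextend -l +100%FREE /dev/debian7-x64/root
      resize2fs /dev/debian7-x64/root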

    Read the article

  • Multiple Rails apps on same subdomain?

    - by Derek
    I recently decided to try out Rails. When working with PHP, I simply had all of my PHP projects in the same directory. For example, I may have http://ubuntu/app1, http://ubuntu/app2, etc. I created a subdomain for Rails (http://ruby.ubuntu), installed Rails and Passenger and everything is working. However, I may be wrong, but it looks like I can only have one Rails app per subdomain? My VirtualHost is as follows: <VirtualHost *:80> ServerName ruby.ubuntu ServerAdmin webmaster@localhost DocumentRoot /var/www/ruby/blog/public <Directory /var/www/ruby/blog/public> Options Indexes FollowSymLinks MultiViews AllowOverride All Order allow,deny allow from all RailsEnv development </Directory> ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/ <Directory "/usr/lib/cgi-bin"> AllowOverride None Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch Order allow,deny Allow from all </Directory> ErrorLog ${APACHE_LOG_DIR}/error.log # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. LogLevel warn CustomLog ${APACHE_LOG_DIR}/access.log combined </VirtualHost> All of my PHP and misc. files are stored in /var/www/main. I want to be able to store all of my Rails apps in /var/www/ruby. I tried changing DocumentRoot to /var/www/ruby, but I don't think it's as simple as that. When I browse to a Rails app's Welcome Aboard page and click on "About my application's environment," I get a 404 page, but when the DocumentRoot is set to the public directory, I get the expected result. I don't want to have to create a new subdomain every time I create a new project. Is there any way I can make it so I can store all of my apps in /var/www/ruby, and browsing to http://ruby.ubuntu will let me access all of my Rails apps there? That way if I want to create a new app, all I have to do is rails new app, no Apache .htaccess or VirtualHost configuration required.
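
    For reference, a hedged sketch of Passenger's sub-URI deployment style from that era (Passenger 2/3 directives assumed; the app and path names are placeholders), which lets several Rails apps share one VirtualHost: each app's public/ directory is symlinked into a plain DocumentRoot and declared with RailsBaseURI.

      # symlink each app's public/ into the shared DocumentRoot
      ln -s /var/www/ruby/app1/public /var/www/rails_root/app1

      # in the VirtualHost:
      #   DocumentRoot /var/www/rails_root
      #   RailsBaseURI /app1
      #   <Directory /var/www/rails_root/app1>
      #     Options -MultiViews
      #     AllowOverride All
      #   </Directory>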

    Read the article

  • CentOS 6.5 x64 + Ajenti + nginx + php-fpm + MySQL setup unable to load the PHP page

    - by Francis
    I'm using Ajenti to set up a PHP web server. I can load the PHP info page, but when I try to load the PHP application copied from another server, I get a blank page with this entry appearing in the access log. I have no clue what is wrong or how to troubleshoot further; please advise. x.x.x.x - - [28/May/2014:10:08:37 -0400] "GET /index.php HTTP/1.1" 200 31 "-" "Mozilla/5.0 (Windows NT 6.1; rv:29.0) Gecko/20100101 Firefox/29.0" Kernel: 2.6.32-431.1.2.0.1.el6.x86_64; cat /etc/redhat-release: CentOS release 6.5 (Final). The vhost config: AUTOMATICALLY GENERATED - DO NO EDIT! server { listen *:80; server_name test.com www.test.com; access_log /var/log/nginx/access.log; error_log /var/log/nginx/error.log; root /www; index index.html index.htm index.php; location ~ \.php$ { alias /www; fastcgi_index index.php; include fcgi.conf; fastcgi_pass unix:/var/run/php-fcgi-php-fcgi-0.sock; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; } }
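
    A hedged first step (my suggestion, not from the article): a blank page returned with HTTP 200 usually means PHP hit a fatal error that is not being displayed, so checking the PHP error log, or what error reporting the PHP in use is configured with, is the quickest way to surface it (the log path below is the stock CentOS 6 default; an Ajenti-managed pool may log elsewhere):

      # surface the error the blank page is hiding
      tail -n 50 /var/log/php-fpm/error.log
      # check where PHP is sending errors and whether display is off
      php -i | grep -E 'display_errors|error_log'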

    Read the article

  • How do I cancel a Windows Server 2003 repair install?

    - by Kilgore2k
    System: Windows Server 2003 Enterprise. Scenario: the NTDS database is corrupt and all attempts to fix it with esentutl fail. I ran chkdsk, which seemed to repair a disk error and give access to the ntds.dit file, but esentutl still fails (I attached the drive to a different server to run esentutl). Error: Access to source database '[path to copy of]/ntds.dit' failed with Jet error -1022. Operation terminated with error -1022 (JET_errDiskIO, Disk IO error) after 0.170 seconds. This error occurs on any disk I copy the files to, including the original location in C:\WINDOWS\NTDS\. Now for the "Stupid!" and "What was I thinking!?" part (must be the late hour...). Stupid: no up-to-date backup; after restoring from a backup I get a network password error from lsass. What was I thinking!?: I started the repair install from the original CD, but the install fails since AD fails to start. Now I can't boot into any mode (safe mode, AD restore, etc.) nor complete the repair install. I would really like to avoid a fresh install, since I have the Exchange server on this DC and would rather migrate to a new server than start from scratch. Thanks!

    Read the article

  • Family server setup

    - by Manny
    Hi all, I really hope some of you can give me some direction. I have set up a Linux server at home, and through Samba I can access files from different computers in my home. I would like to use this server as a file server for my family (brothers, sisters and parents who all live in their own homes). I really like the way it is set up right now with user and permission controls, but I've read that it is a bad idea to open up the Samba port to the world. The requirements are simple: 1) it should be easy to access, using standard web browsers or by mounting the drive (no VPN setup, PuTTY, etc.); 2) it should be somewhat secure. We just want to share family pictures instead of putting them on Facebook, Picasa or some other web service; nothing top secret. Here is what I've looked into: 1) WebDAV. It seems decent, but Windows 7 doesn't seem to like it very much, even with digest-mode authentication, and the user controls and permissions are not as flexible as Samba's (at least to my knowledge). I really like the user and group permissions in Samba, but I could live with WebDAV if it worked seamlessly with Windows; it should just work, shouldn't it? 2) I read somewhere to stay away from FTP, as it is outdated and there are newer and better internet file-server setups. Was that a reference to WebDAV? I am so confused, please help... Manny

    Read the article

  • Change source address based on destination IP

    - by hgj
    We have several "router" machines that gather a lot of external IP addresses on the same host and redirect, NAT or proxy the traffic to the internal network. They also act as routers for the machines on the internal network. This works fine, however I am unable to make the routing table, so I can change the source address, based on the destination a machine from the internal network want to access. Let's say I have a router, that has public addresses P1 (5.5.5.1/24) and P2 (5.5.5.2/24). All traffic goes through P1, but if necessary, the host is reachable on P2 too. This looks like this and works fine: > ip addr ... 1: eth1: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc pfifo_fast state UP qlen 1000 link/ether aa:bb:cc:dd:ee:11 brd ff:ff:ff:ff:ff:ff inet 5.5.5.1/24 brd 5.5.5.255 scope global eth1 inet 5.5.5.2/24 brd 5.5.5.255 scope global secondary eth1:p2 ... Now I want to use P2 as the source address, if I want to access the Google DNS service for example (8.8.8.8). So I add a row in the routing table like: > ip route add 8.8.8.8 via 5.5.5.254 dev eth1 src 5.5.5.2 > ip route ... default via 5.5.5.254 dev eth1 5.5.5.0/24 dev eth1 proto kernel scope link src 5.5.5.1 8.8.8.8 via 5.5.5.254 dev eth1 src 5.5.5.2 ... But this does not work. If I ping 8.8.8.8, the host still uses P1 as the source address, and does not use P2 at all for outgoing connections. Am I doing it right? I guess not...

    Read the article

  • Plesk FTP not working but SFTP and shell are working

    - by shamittomar
    I am facing a strange problem: FTP on my Plesk VPS is not working. Whenever I try to connect, the FileZilla FTP client says: Status: Resolving address of xxxxxxxxxxxxx.com Status: Connecting to xxx.xxx.xxx.xxx:21... Status: Connection established, waiting for welcome message... Error: Could not connect to server So it never even gets to the step of asking for a username/password; it's something else. SFTP on port 22 is working fine, and I can successfully get shell access and run commands. But I NEED FTP access on port 21 too. I have searched everywhere but cannot find any setting to enable it. This is the Plesk version info: Parallels Plesk Panel version 9.5.2, operating system Linux 2.6.26.8-57.fc8, CPU GenuineIntel, Intel(R) Pentium(R) 4 CPU 3.00GHz. Any help is appreciated. [EDIT]: The firewall is not blocking it. I have checked on the server and there are absolutely no blocking rules; the firewall states that all incoming/outgoing connections are accepted for FTP. On the client side (my PC), I can connect to other FTP servers, so this is not an issue with my PC's firewall. Moreover, I cannot even connect to the FTP from online FTP clients like net2ftp.
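
    A hedged place to start (assumptions about the usual Plesk layout of that era, not taken from the article): Plesk normally runs ProFTPD out of xinetd, so if nothing is listening on port 21 the xinetd service, or its ftp_psa entry, is the first suspect:

      # is anything listening on the FTP port at all?
      netstat -tlnp | grep ':21'
      # Plesk's ProFTPD is usually launched by xinetd via /etc/xinetd.d/ftp_psa
      service xinetd status
      grep disable /etc/xinetd.d/ftp_psa
      service xinetd restart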

    Read the article

  • Windows XP Disappearing Folders

    - by XenoFoxx
    I am researching a problem for a friend, and unfortunately do not have direct access to his computer. I've tried to gather as much information as possible and I have researched it on various websites, but I've not found anyone having the same problem my friend is having. So here goes: he has a media server in his home running Microsoft Windows XP. It has 3 drives, 1 for the OS and 2 for mass storage. Not long ago he went to access one of the mass-storage drives and it was empty, except for a single folder. His first assumption was that his roommate had deleted everything on the drive (excluding the remaining folder). He then checked the properties of the drive and it still said the hard drive was nearly full. I told him to check the Recycle Bin, thinking that whoever deleted the files didn't empty it and that they were still taking up space on the drive. My friend said the Recycle Bin was empty. So we have a drive that Windows says is empty (again, except for the remaining folder), but whose drive properties say it's mostly full. Now it gets weirder: my friend tried to create a new folder on this drive and it auto-named itself "New Folder(1)", which means Windows recognizes there is already a "New Folder" in that directory. He tried to rename it to a name that he KNEW was there previously, and Windows wouldn't allow it because it was a duplicate folder name. So now it seems the folders are there, but are not displayed in Windows Explorer. Neither of us has any idea why this is occurring, why the folders vanished, why the one remaining folder didn't vanish, or how to make them visible again. Has anyone else ever experienced this? I can get more details if needed.
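
    A hedged first check (my suggestion, not from the article): this pattern, where folders disappear from Explorer yet still exist and block duplicate names, is commonly caused by the hidden and system attributes being set on them (a frequent malware trick), which can be confirmed and undone from a command prompt (D: stands in for the affected drive):

      rem list everything on the drive, including hidden/system entries
      dir D:\ /a
      rem clear the system and hidden attributes recursively
      attrib -s -h "D:\*" /s /d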

    Read the article
