Search Results

Search found 929 results on 38 pages for 'patrick klug'.

Page 13/38 | < Previous Page | 9 10 11 12 13 14 15 16 17 18 19 20  | Next Page >

  • htaccess for subdomain help

    - by Patrick
    Usually I just use online tools to generate mod_rewrite rules, but this one just wouldn't work. Dynamic URL: http://sub.domain.com/index.php?page=index&name=test Rewritten URL: http://sub.domain.com/test OR http://sub.domain.com/test/ My .htaccess: RewriteRule ^([^/]+)/?$ index.php?page=index&name=$1 [L] Instead of passing "test" for the name variable, I always get the value "index.php". Do any gurus have an idea?
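
    A minimal sketch of a likely fix (assuming the problem is that the internally rewritten request for index.php matches the rule again on the next pass, which is why name ends up as "index.php"): skip requests that map to existing files or directories before rewriting.

        RewriteEngine On
        # Don't rewrite requests for real files or directories, so the
        # internally rewritten index.php isn't fed back into the rule
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^([^/]+)/?$ index.php?page=index&name=$1 [L]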

    Read the article

  • How do I connect to my home's primary wired network through an extra (wireless) router?

    - by Matthew Patrick Cashatt
    Thanks for looking! I have set up a desktop PC in my workshop. The Cat 5 cable connects from this PC to a wireless router which is connected to my home network. The Internet connection is working just fine. However, the "wired" network this is on shows up as a different wired network than the one that the PCs inside my house are connected to. This is a problem because I would like to connect this workshop PC to various shared resources like printers, HD Homerun (cable tv card), shared drives, etc. When I go to "Network and Sharing" and attempt to find the network that the PCs inside my home are connected to, I don't see it. Any help is appreciated. Thanks!

    Read the article

  • Automatically reboot Windows 8 if no internet activity [migrated]

    - by Patrick
    I have a media server located in a very inconvenient part of my house. Occasionally I will have to reset my router or it will reset itself. The issue is the PC loses connectivity for some reason, and I am forced to walk outside, around the house, into the basement, over a bunch of toys and weights and boxes, to push a button to reboot it. I would love to have it check itself every 5-10 minutes and auto reboot if it is unable to ping a given address/IP. Any ideas how to accomplish this?
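
    One hedged way to do this is a small PowerShell script run by Task Scheduler every 5-10 minutes (the target address and script path below are examples; the task should run with highest privileges so Restart-Computer is allowed):

        # check-net.ps1 - reboot if a known address stops answering
        $target = '8.8.8.8'   # example: use your router or ISP gateway instead
        if (-not (Test-Connection -ComputerName $target -Count 2 -Quiet)) {
            Restart-Computer -Force
        }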

    Read the article

  • Changing wallpaper depending on time of day via script or batch file?

    - by Patrick
    I want two different wallpapers that switch with the time of day: the day wallpaper from 6:00 until 22:00, and the night wallpaper from 22:00 until 6:00. I didn't find a program that handles this correctly after standby, so I thought it should be easy to do with Task Scheduler running a script. The question is not only how to write such a script, but also whether the time check should live in the script or in Task Scheduler; I'm not sure which works better when the PC spends long periods in standby. I already tried a few scripts from similar questions and hoped to adapt them to my needs, but they didn't work at all. Anyone able to help me? TIA.
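
    A rough PowerShell sketch with the time check inside the script itself, so Task Scheduler only needs triggers at 6:00, 22:00, at logon and on resume from standby (the wallpaper paths are examples; 20 is the SPI_SETDESKWALLPAPER action for the Win32 SystemParametersInfo call):

        # set-wallpaper.ps1 - pick the wallpaper based on the current hour
        $hour = (Get-Date).Hour
        if ($hour -ge 6 -and $hour -lt 22) {
            $wallpaper = 'C:\Wallpapers\day.jpg'     # example path
        } else {
            $wallpaper = 'C:\Wallpapers\night.jpg'   # example path
        }

        # expose the Win32 call that actually sets the wallpaper
        $signature = '[DllImport("user32.dll", SetLastError = true)] public static extern bool SystemParametersInfo(int uAction, int uParam, string lpvParam, int fuWinIni);'
        Add-Type -MemberDefinition $signature -Name NativeMethods -Namespace Wallpaper

        # 20 = SPI_SETDESKWALLPAPER, 3 = write the change to the profile and broadcast it
        [Wallpaper.NativeMethods]::SystemParametersInfo(20, 0, $wallpaper, 3) | Out-Null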

    Read the article

  • List existing file server permission groups/users

    - by Patrick
    We have taken over a new client and their existing file server is frankly a mess. We have migrated their old file server from a 2k box to a 2k8 DFS cluster and now I'm looking at rebuilding both the folder structure and the permissions. Unfortunately it's been half done with AD groups (poorly named, no descriptions or notes) and half with individual users named directly in the security settings of the folders themselves. What I'm looking to do is dump a complete list of all the folders with their security permissions (ideally ignoring files, but that's not essential). CACLS got me halfway there, but it fails with an odd error message, its output isn't particularly user friendly, and I'm working with roughly 2 TB / 250,000 files here, so I really need something with a bit more functionality. Question: does anyone have experience of something similar, or know of a piece of software that might help me out?
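
    If PowerShell is an option on the 2k8 box, a hedged sketch along these lines dumps folder-only ACLs to a CSV, one row per access control entry (the root and output paths are examples):

        # dump-folder-acls.ps1 - list every folder under the root with its NTFS permissions
        $root = 'D:\Shares'   # example: the root of the migrated data
        Get-ChildItem -Path $root -Recurse |
            Where-Object { $_.PSIsContainer } |
            ForEach-Object {
                $folder = $_.FullName
                (Get-Acl -Path $folder).Access | ForEach-Object {
                    New-Object PSObject -Property @{
                        Folder    = $folder
                        Identity  = $_.IdentityReference
                        Rights    = $_.FileSystemRights
                        Type      = $_.AccessControlType
                        Inherited = $_.IsInherited
                    }
                }
            } | Export-Csv -Path C:\temp\folder-acls.csv -NoTypeInformation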

    Read the article

  • Prevent Xbox users from editing Media Library

    - by Patrick
    I am trying to watch some videos stored on my desktop computer on the Xbox, but they are in a format that the Xbox cannot decode, so I have to stream them through Windows Media Center. However, as soon as I set up Media Center on the Xbox, anyone can go in and browse the directory structure on my desktop. I would like to "lock" the media library so that only I can add and delete folders from my desktop computer. Is it possible?

    Read the article

  • Verizon Fivespot firewall

    - by Patrick
    I have a Verizon Fivespot Wi-Fi router and am having issues connecting to the computer that uses it to get on the internet. I am able to connect to the Fivespot admin pages remotely, and I am able to connect to the internet from the computer behind the Fivespot. There are two settings sections pertinent to this issue: Port Filtering and Port Forwarding. I've tried each individually and both together, but I cannot access anything through the router except the admin page. I am trying to connect over SSH to an Ubuntu 10.04 box on the Wi-Fi. I have called Verizon tech support but they were unhelpful; the person essentially read what it says on each screen without any elaboration. Any help is greatly appreciated!

    Read the article

  • I cannot connect to the database from Drupal

    - by Patrick
    Hi, I've uploaded my Drupal website (and its database) to my new server. The database info is: host: localhost, user: user, pass: pass, database name: database_name. I've set the following line in the settings.php file: $db_url = 'mysqli://user:password@localhost/database_name'; but what I get is this: "If you are the maintainer of this site, please check your database settings in the settings.php file and ensure that your hosting provider's database server is running. For more help, see the handbook, or contact your hosting provider." I'm sure the database is running; it is always running and I can access it with phpMyAdmin, so I don't think the problem is there. The database and website file uploads were also successful, so I don't know what to do to fix this issue. It is MySQL on an IIS server. Thanks
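
    One way to narrow this down is a tiny standalone script on the same server, run outside Drupal with the exact credentials from settings.php; if it also fails, the problem lies with MySQL/PHP on the IIS box rather than with Drupal (a hedged sketch, using the placeholders from the question):

        <?php
        // dbtest.php - connect with the same credentials used in settings.php
        $link = mysqli_connect('localhost', 'user', 'pass', 'database_name');
        if (!$link) {
            die('Connect failed: ' . mysqli_connect_error());
        }
        echo 'Connected OK';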

    Read the article

  • How should I organize my backups?

    - by Patrick
    I'm using rsync for the first time to create daily backups of my websites, and I was wondering: should I overwrite the previous copy, or should I keep multiple copies and overwrite only the oldest one? (I might not have enough space for that, though.) I also have this question: suppose most of the files are accidentally erased at the source. Does rsync then delete all those files from the backup as well, because they no longer exist? How exactly does it work in this case? Thanks
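
    On the deletion question: rsync only removes files from an existing backup copy if you pass --delete while mirroring into it. A minimal sketch of the usual middle ground is dated snapshots that share unchanged files through hard links, so keeping several copies costs little extra space (paths are examples):

        #!/bin/sh
        SRC=/var/www/                 # example source
        DEST=/backups/mysite          # example backup root
        TODAY=$(date +%F)

        # hard-link unchanged files against the previous snapshot
        rsync -a --delete --link-dest="$DEST/latest" "$SRC" "$DEST/$TODAY"

        # repoint "latest" at the snapshot we just made
        rm -f "$DEST/latest"
        ln -s "$DEST/$TODAY" "$DEST/latest"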

    Read the article

  • How to associate localhost in Snow Leopard with a specific IP?

    - by Patrick
    I've disabled the web server on Snow Leopard, and I'm running an emulated (virtual) Ubuntu machine with the lighttpd web server. To access the web pages I currently have to specify the IP of the emulated machine. However, I now need to associate that IP with "localhost" in the Snow Leopard environment: when I type localhost on the Mac, I actually want to reach the web server on the Ubuntu machine. Do I need Apache on Snow Leopard to forward the requests, can I change a network setting, or what? Thanks
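
    Remapping the name "localhost" itself is best avoided (too much on the Mac assumes it is 127.0.0.1), but the same effect for browsing can be had with an extra hostname in /etc/hosts pointing at the VM; a sketch assuming an example VM address:

        # add to /etc/hosts on the Mac (192.168.56.101 is an example - use the Ubuntu VM's IP)
        192.168.56.101   ubuntu.local

        # then flush the DNS cache and browse to http://ubuntu.local
        dscacheutil -flushcache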

    Read the article

  • I killed my VPS by setting Firefox to reload a page every second; now one of my domains is dead and gives a 504

    - by Patrick De Amorim
    I have one domain on my VPS on which I wanted to run a script a few hundred times to simulate users, so I set a few Firefox tabs to refresh every 1 second. Now that domain is dead: it just gives me a 504 every time I try to go to it, even though I restarted NGINX and PHP and even power-cycled the server. All of my other sites on that VPS are OK and running perfectly, just not this one where I tried the reloading. What can I do from here?

    Read the article

  • DFS Root namespace is RDWR for all users

    - by Patrick
    We have an existing DFS Replication and Namespace group that we use to serve the company's files. This has been operating fine for us for some time now, and continues to do so. However, a situation arose yesterday afternoon that has left us stumped. Our namespace is presented as: \\domain.co.uk\public\[8 or 9 folders that are mapped to the users in the business] We had a problem this morning where a number of users started mapping their AD home drive directly to the \\domain.co.uk\public directory, and we found that they had read/write access there. This rapidly became a problem, as at least one director saved some moderately sensitive documents in there and basically anyone could read them. I've tidied up that specific problem with some deft scripting and a slight modification of Group Policy. However, I would like to make \public read-only; the trouble is I can't work out where the ACLs for that folder are held. All the folders presented as \\domain.co.uk\public\[folder] are 'real' folders on logical volumes on our DFS servers, so they are secured with groups applied via the 'security' tab. I'd like to do the same on \public, but I can't find it. I have looked through, amongst other things, \Sysvol\domain.co.uk, but can't find it, and after a lot of clicking and a bit of reading I can't see how to lock it down. Any thoughts?
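
    For what it's worth, on a domain-based namespace the root is backed by an ordinary folder on each namespace server (C:\DFSRoots\Public by default; the actual root target path is visible in the DFS Management console under the namespace's servers), and that folder's NTFS and share ACLs are what govern access to \\domain.co.uk\public itself. A hedged sketch for tightening it (path and group name are assumptions):

        rem run on each namespace server: inspect the root target folder, then
        rem reduce ordinary users to read & execute so \public becomes read-only
        icacls C:\DFSRoots\Public
        icacls C:\DFSRoots\Public /grant:r "DOMAIN\Domain Users":(OI)(CI)RX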

    Read the article

  • VirtualBox Port Forward not working when Guest IP *IS* specified (while doc says opposite)

    - by Patrick
    Trying to port forward from the host (Mac OS X) 127.0.0.1:8282 to the guest (CentOS)'s 10.10.10.10:8080. Existing port forwards include 127.0.0.1:8181 and 9191 to the guest without any IP specified (so whatever it gets through DHCP, as explained in the documentation). Here is how the non-working binding was added: VBoxManage modifyvm "VM name" --natpf1 "rule3,tcp,127.0.0.1,8282,10.10.10.10,8080" Here is how the working ones were added: VBoxManage modifyvm "VM name" --natpf1 "rule1,tcp,127.0.0.1,8181,,80" VBoxManage modifyvm "VM name" --natpf1 "rule2,tcp,127.0.0.1,9191,,9090" And by "non-working", I of course mean not listening (as a prerequisite to forwarding): $ lsof -Pi -n|grep Virtual|grep LISTEN VirtualBo 27050 user 21u IPv4 0x2bbdc68fd363175d 0t0 TCP 127.0.0.1:9191 (LISTEN) VirtualBo 27050 user 22u IPv4 0x2bbdc68fd0e0af75 0t0 TCP 127.0.0.1:8181 (LISTEN) There should be a similar line above but with 127.0.0.1:8282. Just to be clear, this port is listening perfectly fine on the guest itself. And when I remove the guest IP (i.e., clear the 10.10.10.10) the forward works fine, albeit to eth0 (not eth1 where I need it). I can tcpdump and watch the traffic flow back and forth. And yes, I've disabled iptables entirely while testing -- it's not getting blocked anywhere on the guest. As VirtualBox says in the documentation, you are required to specify the guest IP if it's static (makes sense, since there is no DHCP lease for it to track): "If for some reason the guest uses a static assigned IP address not leased from the built-in DHCP server, it is required to specify the guest IP when registering the forwarding rule:". However, doing so (as I need to) seems to break the port forward, with nary a report in any log file I can find. (I've reviewed everything in ~/Library/VirtualBox/.) Other notes: while I used the above command to add the third rule, I've also verified it showed up correctly in the GUI, and removed/re-added it from there just to make sure. This forum link -- while very dated -- looks somewhat related in that a port forward to a static IP was not appearing (perhaps due to a lack of gratuitous ARP being sent, so the host doesn't know the IP is there/available?). Anyway, what gives? Is this still buggy? Any suggestions? If not, easy enough workarounds? What's interesting is that this works perfectly fine on another user's Mac, though he's running a slightly older version (4.3.6 vs. 4.3.12).
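
    One hedged observation: a --natpf1 rule only operates on NAT adapter 1, so the guest IP in the rule has to be the guest's address on that NAT network (typically something like 10.0.2.15), not an address that lives on another adapter such as eth1's 10.10.10.10. Something along these lines may be what the static-IP wording in the manual is really about (10.0.2.15 is an assumption; check the guest's NAT interface):

        VBoxManage modifyvm "VM name" --natpf1 "rule3,tcp,127.0.0.1,8282,10.0.2.15,8080"

    If the service really must be reached on eth1, an alternative is to make eth1 a host-only adapter and connect from the Mac to 10.10.10.10:8080 directly, with no port forwarding involved.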

    Read the article

  • Is there a way to run a command before Puppet implements a change?

    - by Patrick
    I want to have puppet run a specific command before performing any type of change. I am aware of the prerun_command option in the main puppet.conf, but this is not what I'm looking for. I want the command to only run if something is about to change, not on every puppet run. Here's the scenario. Let's say I have a bunch of web servers behind a load balancer. I then want puppet to update the web site files. But in order to prevent issues where some files have been updated, but other files haven't, and the mixed versions causing problems, I want to take the server out of the load balancer pool. I could write a script which when run will tell the load balancer to remove the box from the pool. Then puppet can do the change, and use postrun_command to put the box back in the pool once complete. But I need a way to run that script to remove the server from the pool. The only solution I can think of is to keep 2 copies of the files on the box. One a staging copy, and when puppet updates that, use a notify action to trigger the removal script, and then copy from staging into the live location. But I was hoping for something a little more generic that would work on any change being performed (upgrading a package, restarting a service, creating a user, anything).
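
    A rough Puppet sketch of exactly that staging idea, for reference (the paths, module name and pool scripts are all assumptions): the file resource only sends a notify when something actually changed, so the drain/deploy/rejoin chain stays idle on no-op runs.

        file { '/var/www/staging':
          ensure  => directory,
          recurse => true,
          source  => 'puppet:///modules/mysite/htdocs',
          notify  => Exec['leave-pool'],
        }

        exec { 'leave-pool':
          command     => '/usr/local/bin/leave_pool.sh',
          refreshonly => true,                 # only runs when staging actually changed
          notify      => Exec['deploy-live'],
        }

        exec { 'deploy-live':
          command     => '/usr/bin/rsync -a --delete /var/www/staging/ /var/www/live/',
          refreshonly => true,
          notify      => Exec['join-pool'],
        }

        exec { 'join-pool':
          command     => '/usr/local/bin/join_pool.sh',
          refreshonly => true,
        }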

    Read the article

  • Is it possible to have two web servers hosted in different places share the same domain name?

    - by patrick
    Say I have a WordPress site: https://www.foobar.com and I want an entry point to a Rails app at a certain subdirectory within that same domain: https://www.foobar.com/rails_app I know this is possible if both the WordPress app and the Rails app are hosted on the same box, but is it in any way possible if they are hosted on different boxes? I do not want to use subdomains because I am trying to allow AJAX POST requests from one to the other without having to deal with same-origin policy issues.
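
    It is possible with a reverse proxy: the box that answers for www.foobar.com forwards everything under /rails_app to the other box, so the browser only ever sees one origin. A hedged nginx sketch (the upstream address is an example; Apache's mod_proxy ProxyPass can do the same job if the front box runs Apache):

        # on the server that owns www.foobar.com
        location /rails_app/ {
            proxy_pass         http://10.0.0.2:3000/;   # example address of the Rails box
            proxy_set_header   Host $host;
            proxy_set_header   X-Real-IP $remote_addr;
            proxy_set_header   X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header   X-Forwarded-Proto $scheme;
        }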

    Read the article

  • Windows 8 not shutting down properly

    - by Patrick
    Since installing Windows 8, the computer hasn't been shutting down properly. When selecting to power down, the PC quickly displays the shutting-down screen, the monitor powers off, and the computer remains on but unresponsive. After about 5 minutes, the computer will turn off. Upon booting into Windows again, I am informed that Windows didn't shut down properly. I'm running a fast SSD, and it's a clean install of Windows 8, so there's no way Windows is taking that time to do some sort of hibernate on shutdown or whatever - not to mention the error when entering Windows the next time. This happens on every shutdown. Restart works as expected. EDIT: Formatting again didn't work. Fails regardless of drivers installed. Event Viewer always shows these two messages in close succession: Error (event ID 6008): The previous system shutdown at 7:45:21 PM on 27/10/2012 was unexpected. Critical (kernel power, event ID 41): The system has rebooted without cleanly shutting down first. This error could be caused if the system stopped responding, crashed, or lost power unexpectedly.
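
    A common culprit for exactly these symptoms on Windows 8 is the hybrid shutdown / fast startup feature, so it is worth ruling out by turning it off (this also disables hibernation); from an elevated command prompt:

        powercfg /h off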

    Read the article

  • "Access is denied" when copying text file to printer UNC path

    - by Patrick
    We have a new server running Server 2008 R2. We also have a "DOS-based" program that prints directly to the UNC path of a print share. With the new server, we are unable to print from this program. According to support, the program's printing works in the same way as if we were to do a "copy mytextfile.txt \\myserver\myprinter". When we try to run this command in DOS, we get "Access is denied". Support is saying that this is why the DOS program is not able to print. I have tried granting all permissions on the printer to the appropriate users (under Security of the printer properties) but that did not work. Is there a policy setting that would cause this to be denied?
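
    A long-standing workaround for DOS-era programs, if granting rights on the UNC path keeps failing, is to map the shared printer to a local LPT port and print to that instead (the names below are the ones from the question):

        net use LPT1: \\myserver\myprinter /persistent:yes
        copy mytextfile.txt LPT1: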

    Read the article

  • Is it okay to set MASQUERADE on 2 network interfaces in a Linux server?

    - by Patrick L
    There is a Linux server (A) with 3 network interfaces: eth0, eth1, eth2. IP forwarding has been turned on on this server. eth0 is connected to 10.0.1.0/24; its IP is 10.0.1.1. eth1 is connected to 172.16.1.0/24; its IP is 172.16.1.1, and server A can ping router C at 172.16.1.2. eth2 is connected to 192.168.1.0/24; its IP is 192.168.1.1, and server A can ping server B at 192.168.1.2. Router C is able to route to 172.16.2.0/24 and 172.16.3.0/24.

                                                 [10.0.1.0/24]
                                                       |
        172.16.2.0/24------|                           |
                          [C]------172.16.1.0/24------[A]------192.168.1.0/24------[B]
        172.16.3.0/24------|

    We have set MASQUERADE on eth0, so when server B (192.168.1.2) connects to 10.0.1.0/24, masquerading happens at eth0. Can we also set MASQUERADE on eth1? Is it okay to set MASQUERADE on more than one network interface in Linux?
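
    Masquerading on more than one interface is a normal iptables setup, because each rule only applies to packets leaving through the interface named in -o. A sketch of the two rules (eth2 stays un-NATed, so 192.168.1.0/24 is routed normally):

        iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
        iptables -t nat -A POSTROUTING -o eth1 -j MASQUERADE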

    Read the article

  • ERROR: snapshot_root must be a full path

    - by Patrick
    I want to use rsnapshot to make backups of some folders on a remote server. I've already set up key-based authentication, and in rsnapshot.conf I've specified: snapshot_root [email protected]/ However, I get the following error: ERROR: snapshot_root snapshot_root [email protected]/ - snapshot_root \ must be a full path So I was wondering whether the only way is to mount the remote server first, and if so how (I'm on Ubuntu 9.04). Thanks
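
    For reference, rsnapshot expects snapshot_root to be a local directory; the remote machine goes on a backup line instead, pulled over SSH. A hedged sketch of the relevant rsnapshot.conf lines (host and paths are examples; fields must be separated by tabs, not spaces):

        snapshot_root   /var/backups/rsnapshot/

        # pull the remote folders over SSH rather than making the root itself remote
        backup          user@remotehost:/var/www/      remotehost/

    Mounting the remote server first (e.g. with sshfs) is only needed if the snapshots themselves have to live on the remote machine.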

    Read the article

  • How can I set clean URLs (enable rewrite) if I don't have a domain?

    - by Patrick
    In order to enable clean URLs in Drupal, I add the lines below to the lighttpd configuration file. However, I'm now working on a local server and I don't have a domain available, so I need to work with this address: http://local.ip/Sites/mywebsite I've tried replacing ["host"] with ["socket"] and replacing the domain with the IP and subfolders (see the address above), but without success. How can I set up the configuration file for clean URLs even though I don't have a domain? Thanks

        $HTTP["host"] =~ "(^|\.)mywebsite\.com" {
            server.document-root = "/var/www/sites/mywebsite"
            server.errorlog = "/var/log/lighttpd/mywebsite/error.log"
            server.name = "mywebsite.com"
            accesslog.filename = "/var/log/lighttpd/mywebsite/access.log"
            include_shell "./drupal-lua-conf.sh mywebsite.com"
            url.access-deny += ( "~", ".inc", ".engine", ".install", ".info", ".module", ".sh", "sql", ".theme", ".tpl.php", ".xtmpl", "Entries", "Repository", "Root" )
            # "Fix" for Drupal SA-2006-006, requires lighttpd 1.4.13 or above
            # Only serve .php files of the drupal base directory
            $HTTP["url"] =~ "^/.*/.*\.php$" {
                fastcgi.server = ()
                url.access-deny = ("")
            }
            magnet.attract-physical-path-to = ("/etc/lighttpd/drupal-lua-scripts/p-.lua")
        }
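
    A hedged note on the host matching: $HTTP["host"] compares whatever the browser sends in the Host header, which is simply the bare IP when you browse by IP, so no domain is needed; alternatively the conditional can match the listening socket. Sketch (the IP is an example):

        $HTTP["host"] == "192.168.1.10" {
            # ... same server.* and rewrite directives as above ...
        }

        # or match on the socket lighttpd listens on instead of the Host header
        $SERVER["socket"] == "192.168.1.10:80" {
            # ...
        }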

    Read the article

  • How to specify Multiple Secure Webpages with .htaccess RewriteCond

    - by Patrick Ndille
    I have 3 pages on my website that I want to make secure using .htaccess: login.php, checkout.php and account.php. I know how to make just one page at a time work:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteCond %{REQUEST_URI} /login.php
        RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L]

    I am trying to figure out how to include the other 2 pages so they are also secure, and used the expression below, but it didn't work:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteCond %{REQUEST_URI} /login.php
        RewriteCond %{REQUEST_URI} /checkout.php
        RewriteCond %{REQUEST_URI} /account.php
        RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L]

    Can someone help me with the right expression that will work with multiple pages? The second part is this: if HTTPS is already on and a user moves to a page that is not one of the pages I specified above, I want it to go back to HTTP. How should I write the statement so it redirects back to HTTP if the page is not one of those above? I have my statement like this, but it's not working:

        RewriteCond %{HTTPS} on
        RewriteRule !(checkout|login|account|payment)\.php http://%{HTTP_HOST}%{REQUEST_URI} [L,R]

    Any thoughts?
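
    For what it's worth, consecutive RewriteCond lines are ANDed unless flagged [OR], so the three REQUEST_URI conditions above can never all be true for one request, which is why that block never fires. A hedged sketch that folds the pages into one condition for both directions (add payment.php or others to the list as needed):

        RewriteEngine On

        # force HTTPS on the protected pages
        RewriteCond %{HTTPS} off
        RewriteCond %{REQUEST_URI} ^/(login|checkout|account)\.php [NC]
        RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R]

        # send every other page back to plain HTTP
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !^/(login|checkout|account)\.php [NC]
        RewriteRule (.*) http://%{HTTP_HOST}%{REQUEST_URI} [L,R]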

    Read the article

  • Apprentice Boot Camp in South Africa (Part 2)

    - by Tim Koekkoek
    By Maximilian Michel (DE), Jorge Garnacho (ES), Daniel Maull (UK), Adam Griffiths (UK), Guillermo De Las Nieves (ES), Catriona McGill (UK), Ed Dunlop (UK)

    Today we have the second part of the adventures of seven apprentices from all over Europe in South Africa!

    Kruger National Park & other experiences

    Going to the Kruger National Park was definitely an experience we will all remember for the rest of our lives. This trip, organised by Patrick Fitzgerald, owner of the Travellers Nest (where we all stayed), took us from the hustle and bustle of Joburg to experience what Africa is all about: the wild! The first week's training prior to this trip had been going very well, but we all knew this would be a very nice break before we started the second week of training. And we were right: the animals, scenery and sights we saw were simply incredible and, like I said, something we will remember for the rest of our lives. To see lions, elephants, cheetahs, rhinos and many more in a zoo is one thing, but to see them in the wild, in their natural habitat, is very special, and I personally only realised this from the early 5 am start on the first morning in the Kruger, which was definitely worth it. It was not only about the safari: we also ate some wonderful food, in particular on the Saturday night, when Patrick made us a traditional South African braai, which was one of my favourite meals of the whole two weeks. After the Kruger National Park we had a whole day of traveling back to Johannesburg, but even this was made into a good day by our hosts. Despite the early start on the road, it was all worth it by the time we reached God's Window. The walk to the top was made a lot harder by all the steaks we had eaten in the first week, but the hard walk was worth it at the top, with views that stretched for miles.

    The Food

    The food in South Africa is typically meat, and in big amounts; while there we ate a lot of big beef steaks, ribs and kudu sausage. Most of the meat was cooked with a sauce such as a barbecue glaze. The restaurants we visited were: Upperdeck Restaurant, with live music and a great terrace to eat on; the atmosphere was good for enjoying the music and our food, and most of us ate spare ribs weighing 600 g, with a delicious barbecue sauce. Die Bosvelder Pub & Restaurant, a restaurant with very surprising decor: the walls are covered with many of South Africa's famous animals. The food here was maybe the best we ate in South Africa; our orders included Springbokvlakte lamb's neck stew, beef in gravy, and steaks topped with cheese and then more meat on top! All meals were accompanied by a selection of cauliflower in white sauce, spinach and carrots. Pepper Chair Restaurant, where the speciality is a 1.4 kg T-bone steak, though most of us were happy to attempt the 1 kg version; cooked with barbecue sauce over the meat, it was very good! The only problem was their size, which meant the meat got cold if you did not eat it very fast! We're all waiting for our 1.0 kg T-bone steaks, including our Senior Director EMEA Systems Support Germany & Switzerland, Werner Hoellrigl. The Godfather Restaurant, where the food was again meat in abundance: we ate great ribs, hamburgers and steaks, all accompanied by small plates of carrots and sauteed spinach, very good.

    We had two great weeks in South Africa! If you want to join Oracle, then check http://campus.oracle.com

    Read the article

  • postmap: fatal: open database /etc/postfix/sasl_passwd.db: Permission denied

    - by James Benders
    Hi, I'm configuring Postfix to use an external SMTP relay. For this, I used the following tutorial: http://carlton.oriley.net/blog/?p=31 After following it, I found in the logs that /etc/postfix/sasl_passwd.db couldn't be read; the file didn't exist. So I ran postmap hash:/etc/postfix/sasl_passwd (http://postfix.state-of-mind.de/patrick.koetter/smtpauth/smtp_auth_mailservers.html) as root, but I get: postmap: fatal: open database /etc/postfix/sasl_passwd.db: Permission denied Why do I get this? Thanks.
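
    Since postmap (even as root) has to create sasl_passwd.db inside /etc/postfix, a few things are worth checking: an immutable attribute, a read-only filesystem, or SELinux can all produce this exact error (a hedged checklist, not a diagnosis):

        ls -ld /etc/postfix
        ls -l /etc/postfix/sasl_passwd*
        lsattr /etc/postfix/sasl_passwd* 2>/dev/null    # an 'i' flag would block even root
        getenforce 2>/dev/null                          # SELinux in enforcing mode?

        # once the cause is cleared, the usual sequence is:
        chown root:root /etc/postfix/sasl_passwd
        chmod 600 /etc/postfix/sasl_passwd
        postmap hash:/etc/postfix/sasl_passwd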

    Read the article
