Search Results

Search found 15651 results on 627 pages for 'setup'.


  • Can I use a Windows Server 2003 Domain Controller but my home router for DNS?

    - by NetworkingWannabie
    Hi all. Probably easiest to start with a description of my current setup, which works (oh, and this is a home setup, not an office or anything): I have an ADSL modem with a static IP address (192.168.128.1), and its DHCP capability is disabled. I have a permanently powered-up Windows Server 2003 machine with a fixed IP (192.168.128.2) which provides my domain controller, DHCP, and DNS. The default gateway for everything is my ADSL modem; everything is set up to use the WS2003 machine as the primary DNS with the ADSL modem as secondary DNS just in case the server goes down (everything includes the server itself). Lastly, just in case it's relevant, I have my DHCP leases set to infinite (or whatever the right term is). Everything is pretty hunky dory.

    Except, that is, for the fact that my server is ALWAYS on, and it isn't always used, so I'm burning juice that I don't need to - my server draws around 120W, which isn't immense but isn't irrelevant either. So I'd like to put it into a standby state when it isn't being used (the more standby the better) and then get the clients to wake it up. Am I correct in assuming that this won't work at the moment - a given client would need an IP address to wake the machine up, and it needs the machine to be awake to get an IP - catch 22?

    Assuming I'm correct, can I move to using my router (which is always on) for DHCP? What impact will this have on DC and DNS? Alternatively, does anyone have a better way to achieve this? Can I get the server to wake up when it sees clients look for a DHCP server, etc.? Wow, that came out longer than expected! Thanks for your help.
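
    A detail that may defuse the catch-22: Wake-on-LAN magic packets are addressed to the NIC's MAC address, not an IP, so a client needs no lease to wake the server. A minimal sketch, assuming WoL is enabled in the server's BIOS/NIC and the client has a WoL utility installed (the MAC address here is hypothetical):

        # broadcast a magic packet to the sleeping server's MAC
        wakeonlan 00:11:22:33:44:55
        # or, with etherwake (needs root, sends on the local segment)
        etherwake 00:11:22:33:44:55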

    Read the article

  • What is the 'best practice' for installing perl modules on Solaris/OpenSolaris?

    - by AndrewR
    I'm currently in the process of writing setup instructions for some software I've written that is implemented as a set of Perl modules. Having done this for various flavours of Linux, I'm now doing the same for Solaris/OpenSolaris (v10 only). Part of the setup process is to make sure that dependent Perl modules are installed. This has been pretty easy on Linux as the Perl modules I require tend to be within the distro's packaging system (eg yum install perl-Cache-Cache). This is not the case on Solaris, so I'm working on setup instructions that use the CPAN module to fetch dependent modules (eg perl -MCPAN -e 'install Cache::Cache'). This works ok but there are known problems with modules that require things to be built with a C compiler. The problem is that the C Makefile generated assumes you're using Sun's compiler and uses command-line options not understood by gcc, which you may be using instead. Consulting teh Internetz has thrown up a number of solutions to this:

    - Install and use Sun's compiler
    - Use the perlgcc wrapper script
    - Edit the makefiles by hand (yuk)

    All of these work. My question to those more familiar with Solaris than me is: Is one of these the 'best' or 'most commonly used' method?
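
    For reference, a minimal sketch of the perlgcc route, assuming Solaris's bundled perlgcc wrapper is on the PATH - it regenerates the module's Makefile with gcc-compatible flags instead of Sun Studio's:

        # unpack the distribution via the CPAN shell, then build by hand
        perl -MCPAN -e 'look Cache::Cache'
        perlgcc Makefile.PL       # in place of: perl Makefile.PL
        make && make test && make install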

    Read the article

  • AWS VPC ELB vs. Custom Load Balancing

    - by CP510
    So I'm wondering if this is a good idea. I have an Amazon AWS VPC setup with public and private subnets, so I already get the Internet Gateway and NAT. I was going to set up all my web servers (Apache2 instances) and DB servers in the private subnet and use a load balancer/reverse proxy to pick up requests and send them into the private subnet's cluster of servers. My question, then: are Amazon's ELBs a good fit for this, or is it better to set up my own custom instance to handle the public requests and run them through the NAT using nginx or pound? I like the second option just for the sake of having an instance I can log into and check, as well as taking advantage of caching and fail2ban DDoS prevention, and possibly using failsafes to redirect traffic. But I have no experience with ELBs, so I thought I'd ask your opinions. Also, if you have an opinion on this as well: would the second option allow me to have only one public IP address and be able to route SSH connections through port numbers to the respective instances? Thanks in advance!
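
    If the custom route wins out, a minimal sketch of the nginx side, assuming the proxy instance sits in the public subnet and the web servers are at hypothetical private addresses 10.0.1.x:

        upstream webservers {
            server 10.0.1.10:80;   # private-subnet Apache instances
            server 10.0.1.11:80;
        }
        server {
            listen 80;
            location / {
                proxy_pass http://webservers;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
            }
        }

    The SSH idea works the same way at the network layer: iptables DNAT rules on the proxy instance can map distinct public ports (2201, 2202, ...) to port 22 on each private instance, so a single public IP suffices.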

    Read the article

  • How can I print from my Lion Mac mini to my Windows XP machine, with simple file sharing?

    - by Jules
    I have quite a complicated setup, perhaps, and a lot of history on this issue; I'm hoping that I don't have to buy a new printer. I've got an HP Wireless USB Print Server, which requires client software - I can't just use it as an IP printer. The HP software is pretty poor on the Mac and is no longer supported; it often locks up the print server and it takes considerable effort to actually print something, let alone if a Windows machine attaches to it first. My printer is an Epson Stylus R285. However, the Windows client software is fine and we can print from Windows 7 / XP without problems. We have simple file sharing set up, as this is the only way I could get Windows XP to talk to Windows 7. However, I can't seem to get my Mac mini to connect as anything other than a guest to my XP machine, to connect to the shared printer. I'm now considering some kind of internet printing, as this seems the simplest solution, but I'm not sure what will work with my setup?
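
    One avenue that may be worth a try: if the XP machine shares the Epson directly, the Mac can sometimes reach it through CUPS's smb backend rather than the HP client software. A sketch, assuming the XP share is named EpsonR285 and a matching PPD is available (share name, credentials and PPD path are all hypothetical):

        # on the Mac (Lion ships CUPS with an smb:// backend)
        lpadmin -p EpsonR285 -E \
            -v smb://username:password@WORKGROUP/XPHOST/EpsonR285 \
            -P /path/to/EpsonStylusR285.ppd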

    Read the article

  • Converting software RAID1 to RAID10 for /boot

    - by luckytaxi
    Array info:
    /dev/md0 - /dev/sda1 and /dev/sdb1
    /dev/md2 - /dev/sda2 and /dev/sdb2
    Partition info:
    /boot - /dev/md0
    / - /dev/md1

    I have two drives that are set up as RAID1 using software RAID on Red Hat. I added two additional drives (same size) and I would like to convert the RAID1 to a RAID10. The problem I'm having is adding the last drive to the array. I've gotten as far as creating a RAID10 with two missing devices, but as soon as I add the last drive, all hell breaks loose. It seems /dev/sda1 is the culprit. What I'm not too sure about is how to create the RAID10. I've tried the following:

    mdadm --create /dev/md2 --level=raid10 --raid-devices=4 /dev/sdc1 missing /dev/sdd1 missing

    I then proceeded to fail /dev/sdb1 from /dev/md0 and added that partition to /dev/md2. I proceeded to install the MBR on EACH drive, since /boot resides on /dev/sdX1 of each drive. As a test, all is well; I'm able to boot back into the system once I do a quick reboot. Now, when I go to add the last drive /dev/sda1, it breaks. I attempted to install grub on /dev/sda1 and I get the following:

    grub> root (hd0,0)
     Filesystem type is ext2fs, partition type 0xfd
    grub> setup (hd0)
     Checking if "/boot/grub/stage1" exists... no
     Checking if "/grub/stage1" exists... no
    Error 2: Bad file or directory type

    At this point, the array is hosed, I believe. I rebooted the server and it refuses to boot.
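
    A sketch of the remaining steps, under the assumption that the failure comes from /dev/sda1's stale RAID1 superblock - it is still marked as a member of /dev/md0 when it gets added to the new array:

        # retire the old RAID1 before reusing its last member
        mdadm --stop /dev/md0
        mdadm --zero-superblock /dev/sda1   # clear the old RAID1 metadata
        mdadm /dev/md2 --add /dev/sda1

    Note also that GRUB legacy only understands a /boot on RAID1, where each member doubles as a plain filesystem; it cannot read a RAID10 member, which matches the stage1 errors in the transcript. /boot may need to stay on a small RAID1 even if everything else moves to RAID10.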

    Read the article

  • Password Authentication Fails - NTLMv2

    - by JMeterX
    Environment:
    - Windows 2000 SP4 (EDIT: domain controller, with no trust set up with the Win2008 server)
    - Windows XP machines
    - Windows 2008 Server
    - NetApp NAS

    Problem: We have a shared folder that resides on a NAS, using a Windows 2008 AD for authentication, with the proper permissions set up. When the Windows 2000 machine tries to open the share residing on the Win2008 machine, it is prompted for a username and password. Upon entering the credentials it continuously re-asks for credentials.

    Important details:
    - The Windows 2000 machine can ping both the XP machines and the Windows 2008 server.
    - The Windows 2008 machine is mandated to only use NTLMv2.
    - The Windows 2000 machine was originally set to NTLM but was recently switched to "NTLMv2 if negotiated" for the purpose of trying to connect to the share.
    - As I am sure it will come up, we are using Windows 2000 because of contractual obligations.

    Questions:
    - Why is password authentication failing in this case?
    - After setting a GPO for the Win2000 machine for it to use NTLMv2, do we need to reboot the machine for the changes to take effect? We used SECEDIT to update the GPOs without rebooting.

    UPDATE: We checked both of the 2008 domain controllers to find an error code. We received: Microsoft_Auth_Package_V1_0, 0xc000006a, Event ID: 4776. I know this to be an authentication error ("The value provided as the current password is not correct") via THIS article. We know this password to be correct, but since these two domains (Win2000 & Win2008) do not have a trust set up, what authentication account needs to be used? One that resides on the Win2000-hosted domain?
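
    On the reboot question, a sketch for verifying the effective setting directly rather than guessing: the GPO ultimately writes LmCompatibilityLevel under the LSA key, and LSA changes on Windows 2000 are generally reported to need a reboot even when SECEDIT refreshes policy. Value 3 means "send NTLMv2 response only" (reg.exe ships in the Windows 2000 Support Tools if it isn't already present):

        reg query "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v LmCompatibilityLevel
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v LmCompatibilityLevel /t REG_DWORD /d 3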

    Read the article

  • How to reinstall Windows 7 Embedded?

    - by Joshua Lim
    I need to reinstall Windows 7 Embedded on my server but I'm not able to do so despite repeated tries. I tried booting up the server with the Windows Embedded 7 Setup ISO attached (using IPMI), and I've also tried running setup.exe from the CD-ROM after Windows has booted up. Both methods fail. In the first case, the server simply reboots by itself after I select the "IBW" button. In the second case, the installer reports some files missing while installing. I'm sure my Windows Embedded 7 Setup ISO is correct, because earlier on I used IBW from the same ISO to install Windows Embedded 7 onto the server. Of course, the C drive was empty when I first installed. What should I do? I read that the normal Windows 7 (not embedded) installer allows you to reformat the C drive before reinstalling; there does not appear to be such an option for Windows Embedded. Appreciate any tip. Thanks.
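
    One way around the missing reformat option, offered as a sketch: Windows setup environments usually open a command prompt with Shift+F10, and diskpart can wipe the old installation before IBW runs. The disk number is hypothetical - double-check with 'list disk' first, since 'clean' destroys everything on the selected disk:

        diskpart
        list disk
        select disk 0
        clean
        create partition primary
        format fs=ntfs quick
        active
        exit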

    Read the article

  • Nginx rewrite for link shortener + Wordpress pretty URLs

    - by detusueno
    Okay, so I installed Nginx/PHP/MySQL/Wordpress via an online walkthrough, and it had me enter these rewrites to enable Wordpress pretty URLs:

    if (-f $request_filename) { break; }
    if (-d $request_filename) { break; }
    rewrite ^(.+)$ /index.php?q=$1 last;
    error_page 404 = //index.php?q=$uri;

    This is then included in the vhost for my domain. What I'm trying to do now is add some redirection/link-shortener rewrites that will play nice with the setup I have in mind. I'd like to redirect "x.com/y" to "x.com/script.php?id=y" for all external links that I post. The Wordpress link setup right now has almost all internal links beginning with "news" (x.com/news/post-blah, x.com/news/category/1, etc.), BUT I also have a few root links that point to internal content (x.com/news, x.com/start). I'm guessing that's going to cause some conflicts. What's the best approach to take? I've never worked with Nginx (or any rewrite rules), but maybe I can distinguish between "x.com/news" and "x.com/news/" to allow it to play nice? I had a friend set up a working version of this in Apache, and it'd be nice if I could get it up on Nginx again.
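
    A sketch of one way to keep the two namespaces apart, assuming script.php sits in the document root and the WordPress-owned prefixes are known up front (the whitelist below is illustrative, and WordPress's existing PHP handling stays as-is):

        # WordPress keeps anything under its known prefixes
        location ~ ^/(news|start)(/|$) {
            if (!-e $request_filename) {
                rewrite ^(.+)$ /index.php?q=$1 last;
            }
        }
        # any other single path segment is treated as a short link
        location ~ ^/([A-Za-z0-9_-]+)$ {
            try_files $uri /script.php?id=$1;
        }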

    Read the article

  • Subdomain not hitting new site?

    - by Abe Miessler
    I have an existing site at my.site.com and I would like to set up a staging subdomain (staging.my.site.com). I have my DNS set up and directing staging.my.site.com to my server, but for some reason the web site I created for it in IIS is not being hit. Instead, when I go to staging.my.site.com, it takes me to the original site. The site I created in IIS has a home directory that is totally different from the regular site's home directory. I have added one host header with the following information:

    IP Address: (All Unassigned)
    TCP port: 80
    Host Header Value: staging.my.site.com

    I was under the impression that with the setup described above, hitting staging.my.site.com in a web browser would bring up the staging site, but it does not. Can anyone see what I am doing wrong? UPDATE: One thing I noticed is that my A record maps to the IP address (let's say 1.1.1.1 for this example). In the host headers for the main site (my.site.com), there is a 1.1.1.1 entry. Is this normal? Could it cause the problem I am talking about?
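
    Two quick checks can separate DNS or caching from IIS configuration, sketched here with the question's placeholder IP: confirm what the name resolves to, then request the page with an explicit Host header so the browser is out of the picture.

        nslookup staging.my.site.com
        telnet 1.1.1.1 80
        GET / HTTP/1.1
        Host: staging.my.site.com

    If the staging content comes back over telnet, the binding works and the problem is client-side (cached DNS or a proxy). Seeing the same IP behind both sites is normal: host headers exist precisely so several sites can share one address.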

    Read the article

  • XenServer VMs can't reach network

    - by toto
    I'm currently trying to set up a small cloud architecture. I'm using CloudStack 2.2.14, which needs two nodes: a management server (node1) to provision the cloud, and a XenServer 5.6 SP2 hypervisor (node2) to host the VMs. I succeeded in creating both node1 and node2 as VMs inside an ESXi 5 VMware host. So ESXi 5 is hosting two VMs, node1 + node2, and node2 (the XenServer) will itself host VMs (such as Ubuntu or CentOS). Both node1 and node2 can ping each other and can reach the internet through ESXi 5, but my problem is that VMs inside node2 (XenServer) can't reach the network: they can't ping node1 or ESXi or get an internet connection, but they can ping other VMs in node2 (XenServer). So I tried to:
    1. Set up a DHCP server as node3 in ESXi 5 and connect node2 (XenServer) to it, but the VMs in node2 still can't reach the outer network.
    2. Set up a DHCP server in node2, but the same problem remains.
    So:
    1. Is there any other configuration I'm missing in node2 (considering that I'm sure about the DNS, gateway and netmask configuration)?
    2. Is the problem because I'm creating VMs inside node2 (XenServer), which is itself a VM inside ESXi 5?
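
    On question 2, this is a known snag with nested hypervisors: by default an ESXi vSwitch rejects frames whose source MAC it didn't assign, and every VM inside the XenServer uses exactly such a MAC. A sketch of the usual remedy, assuming vSwitch0 is the switch carrying node2's traffic (the name is an assumption; the same settings live in the vSphere Client under the vSwitch security policy):

        esxcli network vswitch standard policy security set \
            --vswitch-name=vSwitch0 \
            --allow-promiscuous=true \
            --allow-forged-transmits=true \
            --allow-mac-change=true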

    Read the article

  • ActiveDirectory user files aren't being shared?

    - by Ryan
    I'm taking a class in college and my current project is to set up Active Directory. So I have two VMs: one is Windows Server 2008 R2 and the other is Windows 7. I set up the domain team15.net (this is the FQDN for the forest) on the W2008 server machine. I then connected the Windows 7 machine to the team15.net domain, and now I can log in to the administrator account on the W2008 machine by using TEAM15\Administrator for the username. However, any files that I add in Windows 7 while logged into TEAM15\Administrator are NOT shown on the Windows 2008 machine. Is this normal? It seems like any files I add for this user in the domain exist only on the machine that I'm currently using. Is it possible to change this so all files for all users are synced to all computers in the domain? If not, then what are the alternatives? I noticed that back in high school we also used domains, but there was a shared drive E:\ where you had to store your files, because if you put them on the desktop instead they would suddenly disappear once you logged out and back in. How can I set up a shared drive? And which computer would provide the storage for this drive?
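
    For context, what's described is normal: a domain account gets a fresh local profile on every machine unless roaming profiles or folder redirection are configured. A sketch of the server-side share both approaches need, assuming D:\Profiles on the Windows 2008 machine (path and share name are illustrative); each user's Profile path in Active Directory Users and Computers would then point at \\SERVER\Profiles$\%username%:

        rem on the Windows Server 2008 machine
        mkdir D:\Profiles
        net share Profiles$=D:\Profiles /grant:Everyone,Full
        rem the $ suffix hides the share from casual browsing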

    Read the article

  • Java website on Tomcat PHP website on Apache - how to get PHP web pages into Java web pages?

    - by Venkat
    We have a Java web application deployed on Tomcat. We also set up Apache and mod_proxy_ajp to route web requests (port 80/443) to Tomcat. We would like to deploy a PHP application on the same Apache server - probably under a subdirectory (/var/www/ourapp). Now we would like to access & display web pages from the PHP application within web pages generated by the Java application. Planning to implement single sign-on as well. Example: a web page from Java has JQuery tabs, and we'd like to display the PHP web page within a tab while all other HTML comes from the Java application. Can you please give an overall picture of how to proceed with this? Mainly:
    1. How should we install/set up our PHP application on the same Apache server that is used to route web requests to Tomcat? I.e., either set up a subdomain or install in a subdirectory.
    2. How do we bring the PHP pages into the present web pages (generated by Java)? Can we use AJAX requests, or should we go for Java PHP Bridge/Quercus-style applications?
    Thank you for your time in advance. Regards.
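
    On point 1, a sketch of the Apache side, assuming mod_proxy_ajp currently forwards everything to Tomcat; the exclusion rule must precede the catch-all so /ourapp stays with Apache/PHP (the AJP port is the conventional 8009):

        Alias /ourapp /var/www/ourapp
        <Directory /var/www/ourapp>
            Order allow,deny
            Allow from all
        </Directory>
        # exclusions must come before the general ProxyPass
        ProxyPass /ourapp !
        ProxyPass / ajp://localhost:8009/
        ProxyPassReverse / ajp://localhost:8009/

    On point 2, loading the PHP page into a tab via an AJAX request (or an iframe) keeps the two applications decoupled; a bridge such as Quercus is only needed if the PHP code must run inside the JVM itself.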

    Read the article

  • Install Exchange 2013 with DSC

    - by Alain Laventure
    I tried to install Exchange 2013 with the WindowsProcess DSC resource into an existing Exchange configuration. All prerequisites are installed (the Exchange organization already exists). This is my resource section:

    WindowsProcess Exchange2013
    {
        Credential = $credential
        Path       = "C:\Sources\Cumulative Update 5 for Exchange Server 2013 (KB2936880)\Setup.exe"
        Arguments  = "/mode:Install /role:Mailbox /IAcceptExchangeServerLicenseTerms /TargetDir:C:\EX2013"
        Ensure     = "Present"
    } #End Filter
    } #End Node
    } #End configuration

    /*
    @TargetNode='TargetDSC02'
    @GeneratedBy=exadmin
    @GenerationDate=08/02/2014 08:16:03
    @GenerationHost=SOURCEDSC02
    */
    instance of MSFT_Credential as $MSFT_Credential1ref
    {
        Password = "Password1";
        UserName = "S05\\Exadmin";
    };

    Exadmin is a member of the Organization Management group and also a member of the Domain Admins group, so it should be able to install Exchange. When I execute this resource, the Exchange installation starts, but after one minute it stops with this error:

    Failed [Rule:GlobalServerInstall] [Message:You must be a member of the 'Organization Management' role group or a member of the 'Enterprise Admins' group to continue.]

    To confirm that rights really are the problem, I created a special user with only local Administrator rights on the Exchange server and no Exchange permissions, and ran the following manually on the new Exchange server:

    .\Setup.exe /mode:Install /role:Mailbox /IAcceptExchangeServerLicenseTerms /TargetDir:C:\EX2013

    I got the same error as with DSC. After adding my test user to the Organization Management group, I ran the same command again and the Exchange 2013 installation finished without any error. That proves the DSC problem is a permissions issue.
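
    One plausible explanation, offered as an assumption rather than a verified diagnosis: the DSC Local Configuration Manager runs as SYSTEM, and the WindowsProcess Credential property starts the process under a logon whose token doesn't carry group memberships the way an interactive logon would. On WMF 5 and later, every resource accepts PsDscRunAsCredential, which runs the whole resource under the given account; a sketch:

        WindowsProcess Exchange2013
        {
            Path      = "C:\Sources\Cumulative Update 5 for Exchange Server 2013 (KB2936880)\Setup.exe"
            Arguments = "/mode:Install /role:Mailbox /IAcceptExchangeServerLicenseTerms /TargetDir:C:\EX2013"
            Ensure    = "Present"
            # runs the resource itself as Exadmin (WMF 5+ only)
            PsDscRunAsCredential = $credential
        }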

    Read the article

  • Internet Problem: Wireless connected but no connection to internet

    - by Josh K
    Hey, I have an interesting network setup on a laptop here and for some reason the internet isn't working. I am connected to a secure network via a wireless router; the taskbar says I am connected with good signal strength, but in my browser I can't reach any websites. The error is: "This webpage is not available." (Chrome). I am using Chrome, but websites don't work in IE either. Here's a little background on the setup I have. I have Ethernet connected to the laptop with a static IP, and then I have wireless set up with DHCP enabled. I use the Ethernet connection to reach the internal network (for remote desktop) but the wireless for internet (to avoid the network firewalls). This setup has worked fine for a few months, but I can't figure out what is going on now. Might be worth noting it is a Lenovo ThinkPad and I just uninstalled ThinkVantage Access Connections (it was giving me ample problems prior to this one, which I consider a step up). Tried repairing the connection as well; let me know if you guys have any ideas please! EDIT: Solved - dead modem in the server room. Sorry guys, didn't have access to that myself.
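
    For anyone hitting the same symptoms before finding a dead modem: a dual-homed setup like this often breaks when both NICs carry a default gateway, since Windows may then route internet traffic out the wired side. A quick diagnostic sketch:

        rem look for two 0.0.0.0 default routes
        route print
        rem only the wireless NIC should list a default gateway
        ipconfig /all
        rem if this works but browsing fails, suspect DNS instead
        ping 8.8.8.8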

    Read the article

  • Is Ubuntu a viable replacement of Windows XP for small enterprise environments?

    - by Alex. S.
    Hi all, I'm a newbie systems administrator, so any advice would be great. I would like to set up Ubuntu 8.04 LTS in a small management-consulting office (around 50 workstations) instead of Windows XP. I would install MS Office 2007 via WINE (*). It would be a fresh installation, so the migration would be less of a pain. The new setup would also include a small server as a document repository and a backup server for now. Later, I would install other goodies like an IM server, a document management solution, and some collaborative tool or other. What do you advise in this scenario? Do you think it is viable? Should I try to convince my managers this is a good idea? I consider myself a fairly experienced user of both systems, and I'm the only guy in charge of everything. I need to cut costs, and I think that antivirus and antimalware software are a waste of money and time. Is this a good idea? Or should I resign myself to locking down the Windows systems and installing AV software? Is there anything else in this setup I'm not foreseeing? (*) The only catch on my test machine so far has been that Office SmartArt doesn't work properly; the rest of Office 2007 seems OK.

    Read the article

  • phpmyadmin “Forbidden: You don't have permission to access /phpmyadmin on this server.”

    - by Caterpillar
    I need to modify the file /etc/httpd/conf.d/phpMyAdmin.conf in order to allow remote users (not only localhost) to log in:

    # phpMyAdmin - Web based MySQL browser written in php
    #
    # Allows only localhost by default
    #
    # But allowing phpMyAdmin to anyone other than localhost should be considered
    # dangerous unless properly secured by SSL

    Alias /phpMyAdmin /usr/share/phpMyAdmin
    Alias /phpmyadmin /usr/share/phpMyAdmin

    <Directory "/usr/share/phpMyAdmin/">
        Options Indexes FollowSymLinks MultiViews
        AllowOverride all
        Order Allow,Deny
        Allow from all
    </Directory>

    <Directory /usr/share/phpMyAdmin/setup/>
        <IfModule mod_authz_core.c>
            # Apache 2.4
            <RequireAny>
                Require ip 127.0.0.1
                Require ip ::1
            </RequireAny>
        </IfModule>
        <IfModule !mod_authz_core.c>
            # Apache 2.2
            Order Deny,Allow
            Allow from All
            Allow from 127.0.0.1
            Allow from ::1
        </IfModule>
    </Directory>

    # These directories do not require access over HTTP - taken from the original
    # phpMyAdmin upstream tarball
    #
    <Directory /usr/share/phpMyAdmin/libraries/>
        Order Deny,Allow
        Deny from All
        Allow from None
    </Directory>

    <Directory /usr/share/phpMyAdmin/setup/lib/>
        Order Deny,Allow
        Deny from All
        Allow from None
    </Directory>

    <Directory /usr/share/phpMyAdmin/setup/frames/>
        Order Deny,Allow
        Deny from All
        Allow from None
    </Directory>

    # This configuration prevents mod_security at phpMyAdmin directories from
    # filtering SQL etc. This may break your mod_security implementation.
    #
    #<IfModule mod_security.c>
    #    <Directory /usr/share/phpMyAdmin/>
    #        SecRuleInheritance Off
    #    </Directory>
    #</IfModule>

    When I get to the phpMyAdmin web page, I am not prompted for a user and password before getting the error message: Forbidden: You don't have permission to access /phpmyadmin on this server. My system is Fedora 20.
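
    A likely culprit, stated as an assumption since it depends on the httpd build: Fedora 20 ships Apache 2.4, where the old Order/Allow directives only take effect through mod_access_compat, and the main <Directory> block above never sets a 2.4-style rule. A sketch of the 2.4-safe equivalent for the main directory:

        <Directory "/usr/share/phpMyAdmin/">
            AllowOverride all
            <IfModule mod_authz_core.c>
                # Apache 2.4
                Require all granted
            </IfModule>
            <IfModule !mod_authz_core.c>
                # Apache 2.2
                Order Allow,Deny
                Allow from all
            </IfModule>
        </Directory>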

    Read the article

  • Does anyone know how to "tcpdump" traffic decrypted by Mallory MITM?

    - by chriv
    I'm looking for some help in capturing network traffic that I can analyze in Wireshark (or other tools). The tool I'm using is mallory. If anyone is familiar with mallory, I could use some help. I've got it configured and running correctly, but I don't know how to get the output that I want. The setup is on my private network. I have a VM (running Ubuntu 12.04 - precise) with two NICs:

    - eth0 is on my "real" network
    - eth1 is only on my "fake" network, and is using dnsmasq (for DNS and DHCP for other devices on the "fake" network)

    Effectively eth0 is the "WAN" on my VM, and eth1 is the "LAN" on my VM. I've set up mallory and iptables to intercept, decrypt, encrypt and rewrite all traffic coming in on destination port 443 on eth1. On the device I want intercepted, I have imported the ca.cer that mallory generated as a trusted root certificate. I need to analyze some strange behavior in the HTTPS stream between the client and server, so that's why mallory is set up in between for this MITM. I would like to take the decrypted HTTPS traffic and dump it to either a logfile or a socket in a format compatible with tcpdump/wireshark (so I can collect it later and analyze it). Running tcpdump on eth1 is too soon (it's encrypted), and running tcpdump on eth0 is too late (it's been re-encrypted). Is there a way to make mallory "tcpdump" the decrypted traffic (in both directions)?

    Read the article

  • Windows 7 DVD doesn't boot up, neither does USB. :'(

    - by Manan Shah
    My problem is that I'm not able to install Windows 7. I've been trying to install it for the past week. The methods I've tried are: I have a Windows 7 bootable DVD which doesn't boot up (I've set the BIOS to boot from the DVD-ROM first, but it just won't boot from the DVD). I tried to install Windows 7 from the same DVD on a friend's PC and it worked, so the DVD has no issues. I tried to run 'Setup.exe' from within the DVD. Two options pop up, 'Check compatibility' and 'Install now'. On clicking Install now, after some time, an error is encountered with the message 'Windows was unable to create a required installation folder', error code 0x8007000D. I am running Windows XP Professional and there's only one user on the PC, which is the Admin, so I don't know why the setup is not getting permissions. I've also uninstalled my antivirus and CD burning software, disabled the firewall and disconnected all other devices, but it's still the same. I tried to install from a USB device by making it bootable, but that too doesn't work (yes, the mobo supports booting from USB). The problem is that XP does not recognize a 'USB' device on boot; rather, it shows this USB stick as a removable 'Hard Drive'. Furthermore, when I changed the hard drive boot order to boot from this removable hard drive first, it still boots my existing OS. Is there anything else that can be done? Any help would be greatly appreciated. :) Please ask if any other information is required; this post is becoming increasingly long as is. PS: I want to dual boot Windows 7 with my existing XP, but that would be after I manage to run the Windows 7 setup in the first place. PPS: Please bear with any 'not-so-technical' terms, I am a beginner at this. Again, thank you for taking the time to help, really appreciate it. :)
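
    On the USB front, a sketch that sometimes helps when the BIOS lists the stick as a removable hard drive: the stick only boots if its partition is marked active and carries a Windows 7 boot sector. Disk numbers and drive letters below are hypothetical - verify with 'list disk' first, since 'clean' wipes the selected disk:

        diskpart
        list disk
        select disk 1
        clean
        create partition primary
        select partition 1
        active
        format fs=ntfs quick
        assign
        exit
        rem copy the entire Windows 7 DVD onto the stick (say E:), then
        rem write the boot sector using the tool on the DVD (say D:):
        D:\boot\bootsect.exe /nt60 E: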

    Read the article

  • STOP 0x7b booting from iSCSI

    - by Michael
    Hi, I have a Windows 2008 SBS running. It boots off iSCSI. That setup worked for months until yesterday: I intended to reboot and got a

    STOP 0x0000007B INACCESSIBLE_BOOT_DEVICE

    and have no idea why. My setup hasn't changed: no new controller, no new or changed iSCSI targets, no new network card or IP address changes. I had all Windows updates on it.

    - Last known good: same STOP.
    - Allow unsigned drivers: same STOP.
    - Safe mode (all variants): same STOP.
    - Mounting the target from a client: works, and a filesystem check comes back fine.

    I booted off the SBS DVD, but in the computer repair options my target doesn't appear. When I choose setup, the target does appear. So, how can I diagnose what's going wrong? Any helpful tools? Any hints? Thanks in advance, Michael
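
    A long-shot sketch for the repair-environment symptom, assuming the recovery command prompt includes the Microsoft iSCSI initiator CLI (portal address and IQN below are hypothetical): logging the target in by hand may make the disk visible to the repair options.

        iscsicli QAddTargetPortal 192.168.1.50
        iscsicli ListTargets
        iscsicli QLoginTarget iqn.2001-05.com.example:sbs-boot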

    Read the article

  • Microsoft Office 2003 applications crash on 'Save As' to a network mapped drive

    - by Archit Baweja
    Hey guys, I'm not sure if this belongs on the ServerFault forums, so I figured I'd ask here first because it's a workstation/client-side issue. I have a client where we have Windows Server 2003 set up, with Windows XP Professional on all the workstations. We've set up a domain and all workstations log on to the domain (authenticated by the Windows domain controller), and in the logon script we map drives on each workstation. Everything is working peachy except for one workstation, where when I open a file in Excel from a mapped drive, it opens fine, but when I hit Save As, the Save As dialog pops up and hangs. I cannot perform any other action in Excel, and when I try to cancel the Save As dialog, Excel crashes. The mapped drive opens up fine in Windows Explorer. To investigate further, I created a new blank text document on the network drive in Windows Explorer, opened it, hit Save As, and the Save As dialog opened up fine and let me save the document. I repeated the same steps with a Word document, but this time the Save As dialog hung/froze again. So I'd imagine it's a Microsoft Office issue. Any ideas?
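
    One commonly suggested culprit for Save As hangs on mapped drives, offered as a test rather than a confirmed diagnosis: the Office common dialog enumerates My Network Places and recent-file shortcuts, and a single stale entry pointing at a dead host can stall it. Two cheap experiments on the affected workstation:

        rem inspect NetHood and delete stale network-place shortcuts by hand
        explorer "%USERPROFILE%\NetHood"
        rem temporarily stop the WebClient (WebDAV) service as a test
        net stop WebClient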

    Read the article

  • Which virtualization technology is right for me?

    - by Chris
    I need a little help getting this sorted out. I want to set up a Linux virtual server that I can use to run both server and desktop systems. I want a Linux host that is minimalist in nature, as all the main OS will be doing is acting as a hypervisor. The system I'm trying to set up will be running a file server, Windows 7, Ubuntu 10.04, Windows XP, and a firewall/gateway security system, with all the client OSes accessing and storing files on the file server. Also, all network traffic will be routed through the gateway guest OS. The file server will need direct disk access, while the other guests can run on disk images. All of this will be running on the same computer, so I won't be remoting in to access the guests. Also, if possible, I would like to be able to use my triple-head setup in the guest OSes. I've looked at Xen, KVM and VirtualBox, but I don't know which is best for me. I'm really debating between KVM and VirtualBox, as KVM seems to support direct hardware access.
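
    Worth noting for the direct-disk requirement: KVM can hand one guest a whole block device while the others run from image files. A sketch using virt-install, assuming libvirt is installed and /dev/sdb is the disk meant for the file server (all names are illustrative):

        virt-install --name fileserver --ram 2048 --vcpus 2 \
            --disk path=/dev/sdb,device=disk,bus=virtio \
            --cdrom /isos/ubuntu-10.04-server.iso \
            --network bridge=br0

    VirtualBox can also do raw disk access (VBoxManage internalcommands createrawvmdk), but KVM tends to suit a headless, minimalist host better.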

    Read the article

  • Netgear CG3000D new Modem/Router - Random High Ping

    - by justin.chmura
    Cox recently came out and looked at my internet and decided that the modem I had was causing high latency. The speed was fine, but the ping would spike to 100 and over when gaming or putting any load beyond browsing on the line. After they replaced it, it seems like I get better latency on average, but when it spikes I get upwards of 300 ping with around 500 jitter. I figured I would hit the Server Fault universe before sending another email to Cox. I opted not to do the Cox setup, as it was an extra $20 which I thought would have just set up the wireless (which I can handle). Is there a setting or something that I missed that needs to be set up? The firmware for the CG3000D is awful and not fun to use. I did change some hidden settings on the RgServices.asp page (I'll attach a screenshot). I've also heard that the router/modem combos are awful and that I should go back and just ask for a stand-alone modem. Any input is helpful. All screenshots: http://imgur.com/a/JX6qu#0

    Read the article

  • Dual DC Time Service

    - by poconnor
    I believe I'm having an issue with my Domain Controllers and Time Server. On my back up DC, I keep seeing a warning stating "The time service has stopped advertising as a time source because the local clock is not synchronized." Does this mean that my backup DC believes it's a Time Server? My PDC should be the time server and I have gone through setting up the PDC as the time server. I was not around for the original setup of the time server with the old PDC and Backup DC. But I believe the old PDC was the time server so I setup the new PDC as the new time server, when I decommissioned the old PDC. Is it possible that the Backup DC was setup as the time server and it still thinks it's suppose to be giving out time to everyone? Registry for PDC has NTP Registry for Backup has NT5D5 Results of w32tm /monitor Getting AD DC list for default domain... Analyzing:delayoffset from DC1.local..com Stratum: 4 delayoffset from DC1.local..com Stratum: 3 Warning: Reverse name resolution is best effort. It may not be correct since RefID field in time packets differs across NTP implementations and may not be using IP addresses. DC2.local..com[192.168.1.8:123]: ICMP: 1ms NTP: -0.6349491s RefID: DC1.local..com [192.168.1.9] DC1.local..com *** PDC ***[192.168.1.9:123]: ICMP: 0ms NTP: +0.0000000s RefID: wwwco1test12.microsoft.com [65.55.21.20]

    Read the article

  • What's the best cloud backup solution for a small scale server environment?

    - by nbv4
    I have a server that runs a postgres database that contains about 200MB of data. Currently I have a cron job set up on my home computer which:

    - SSHes into my server
    - runs a remote script which makes a backup of the database
    - scp's that dump over to my local hard drive for storage (each dump is 20MiB)
    - does this every six hours (one month of backups is roughly 2GiB)

    The problem with this setup is that if my local machine goes down for whatever reason, no backups will be made. Also, I can't have the cron run from the server, because I can't have it scp'd to my local machine from my server (firewalls and all that crap). My local machine is running Ubuntu 10.04, and my server is Ubuntu 9.10 server edition. I looked into Ubuntu One, but currently it's GUI-only. I also looked into Dropbox, but it's a pain in the ass to get set up in Linux without GUI support. Amazon S3 looks good, but it's not free (yet dirt cheap). Is there any other alternative that I should look into? I'd prefer something where I can just have my script dump the database into a directory and have the backup service 'watch' that folder and sync accordingly. I can maybe also have my local machine sync to the cloud backup so I have even more redundancy, plus easy access to my backups for use in testing. Edit: My server is a VPS, so whatever solution I end up using has to be 100% software based.
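
    If S3's cost turns out to be acceptable (pennies per month at ~2GiB), a minimal sketch of a server-side cron job using s3cmd, assuming the tool is installed and configured with AWS keys (bucket name and paths are hypothetical):

        #!/bin/sh
        # dump, compress, and push to S3; run from cron every six hours
        STAMP=$(date +%F-%H%M)
        pg_dump mydb | gzip > /var/backups/db-$STAMP.sql.gz
        s3cmd put /var/backups/db-$STAMP.sql.gz s3://my-backup-bucket/postgres/
        # prune local copies older than 30 days
        find /var/backups -name 'db-*.sql.gz' -mtime +30 -delete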

    Read the article
