Search Results

Search found 6630 results on 266 pages for 'everyone'.


  • Batch copy gives errors, xcopy works fine

    - by ndm13
    I am writing a general file backup program. It searches the drive for files matching a set of types and then writes them to a folder on the desktop. I wrote it using xcopy on Windows XP, but upon learning that xcopy was deprecated in favor of robocopy in Vista and newer, and still wanting to maintain compatibility, I decided to switch to the non-deprecated copy. This is where the problems begin. I'm trying to fix the copy routine: I thought I had everything sorted out, but it doesn't copy anything. My output is zero files copied for every iteration.

    Original code using xcopy:

        for /r %%a in (*.bmp *.dds *.gif *.jpg *.jpeg *.png *.psd *.pspimage *.tga *.thm *.tif *.tiff) do (
            echo f | xcopy "%%a" "%HOMEDRIVE%%HOMEPATH%\Desktop\LDR\Images\Bitmap\%%~nxa" /q /y /g /c
        )

    Revised (broken) code using copy:

        for /r %%a in (*.bmp *.dds *.gif *.jpg *.jpeg *.png *.psd *.pspimage *.tga *.thm *.tif *.tiff) do (
            copy "%%a" "%HOMEDRIVE%%HOMEPATH%\Desktop\LDR\Images\Bitmap\%%~nxa" /d /y /z
        )

    Output:

        The system cannot find the path specified.
        0 files copied.

    I know that it seems everyone uses either xcopy or robocopy, but can anyone help with copy? Note: I'm using batch to keep it very lightweight and command-line accessible.
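
    A likely culprit, offered as an assumption: unlike xcopy, plain copy never creates missing directories in the destination path, and "The system cannot find the path specified" is exactly the error a missing target folder produces. A minimal sketch of that fix, creating the folder once before the loop (paths as in the question):

        set "dest=%HOMEDRIVE%%HOMEPATH%\Desktop\LDR\Images\Bitmap"
        rem copy cannot create the target tree, so make it first (md creates intermediate dirs)
        if not exist "%dest%\" md "%dest%"
        for /r %%a in (*.bmp *.dds *.gif *.jpg *.jpeg *.png *.psd *.pspimage *.tga *.thm *.tif *.tiff) do (
            copy "%%a" "%dest%\%%~nxa" /d /y /z >nul
        )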

  • WBadmin script to back up System State into another partition fails?

    - by Albert Widjaja
    Hi Everyone, I have tried numerous times, but it still fails as in the following transcript. Is there any way to create an automated script to back up the system state on my Windows Server 2008, using WBAdmin, into a directory on the D: drive? Any help would be greatly appreciated.

        C:\Users\Administrator>wbadmin START SYSTEMSTATEBACKUP -backuptarget:D:\Admin\Backup
        wbadmin 1.0 - Backup command-line tool
        (C) Copyright 2004 Microsoft Corp.

        Starting System State Backup [12/02/2011 4:22 AM]
        Retrieving volume information...
        This would backup the system state from volume(s) SYS(C:) to D:\Admin\Backup.
        Do you want to start the backup operation?
        [Y] Yes [N] No Y

        ERROR - Specified backup location could not be found.

    The same error comes back for every variation of the target I have tried:

        C:\Users\Administrator>wbadmin START SYSTEMSTATEBACKUP -backuptarget:D:\Admin
        ERROR - Specified backup location could not be found.

        C:\Users\Administrator>wbadmin START SYSTEMSTATEBACKUP -backuptarget:"D:\Admin\Backup" -quiet
        ERROR - Specified backup location could not be found.

        C:\Users\Administrator>wbadmin START SYSTEMSTATEBACKUP -backuptarget:D:\ -quiet
        ERROR - Specified backup location could not be found.
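
    Two things worth checking, offered as assumptions rather than a confirmed fix: on Windows Server 2008, -backupTarget for a system state backup must name a volume (D:), not a folder, and backing up the system state to some volumes is blocked until the AllowSSBToAnyVolume registry value documented in Microsoft KB944530 is set. A sketch:

        reg add HKLM\SYSTEM\CurrentControlSet\Services\wbengine\SystemStateBackup /v AllowSSBToAnyVolume /t REG_DWORD /d 1 /f
        wbadmin start systemstatebackup -backupTarget:D: -quiet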

  • Populating a drive that was formatted to NTFS on a separate computer

    - by Will Love
    I recently downloaded some files that were over 4 GB and wouldn't do a straight transfer to an external HD. This is all being done on a separate computer we shall call computer1; my computer (computer2) is where I'm trying to get the files to. I used cmd to "soft format" the drive to NTFS on computer1 so I could then take the drive over and transfer the files to computer2. This was supposed to allow the format change without losing the files I already had on there. After the process was done, I checked the drive on computer1 and all the old files were there and working. But when I took the drive back to computer2 to transfer the files I had just downloaded, computer2 won't recognize or populate the external drive, so it can't give it a drive letter and function. The external drive works fine on computer1, but how do I get it to work again on computer2? When I try to populate it, I can't access the properties function in order to set permissions for Everyone so it is recognized. Any help would be gratefully received. Also, I am running Windows 7 Ultimate edition, if that helps.
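
    As a first diagnostic (a sketch, not a guaranteed fix): if the disk shows up in Disk Management without a letter, assigning one manually from an elevated prompt often brings it back. The volume number below is hypothetical; pick the one diskpart actually lists:

        diskpart
        DISKPART> list volume
        DISKPART> select volume 3    (hypothetical number - use the one shown for the external drive)
        DISKPART> assign letter=E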

  • adding or routing additional domain email addresses

    - by Mustafa Ismail Mustafa
    We have Exchange 2007, and we bought a new domain name; we're keeping the old one so that we can wean everyone off of the old addresses. Now I'm wondering how to go about this.

    I need to add the new domain as accepted and authoritative by the Exchange server. Email on the new domain needs to get routed to the inbox, and likewise for the old addresses; however, I want the reply-to in the header changed to the new email address automatically. I also want to set the new email addresses as the defaults. Ideally, I'd like to add a message at the bottom of every externally outgoing email saying that the new address is [email protected] - but this is a nice-to-have, certainly not a must-have.

    I've added the new domain as authoritative, and managed to change the primary SMTP email addresses to the new one, but sent emails are not being routed to them, and neither are the old email addresses! Now how would I go about fixing all of that? I'm completely stumped! TIA
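
    The usual mechanics in the Exchange Management Shell look roughly like the sketch below: one accepted domain, plus an email address policy in which the uppercase SMTP template marks the new domain as the primary (reply-to) address while the lowercase smtp template keeps the old one as a secondary, so mail to both still lands in the same inbox. Domain and policy names here are placeholders:

        New-AcceptedDomain -Name "New Domain" -DomainName newdomain.com -DomainType Authoritative
        Set-EmailAddressPolicy "Default Policy" -EnabledEmailAddressTemplates "SMTP:%m@newdomain.com","smtp:%m@olddomain.com"
        Update-EmailAddressPolicy "Default Policy"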

  • Prevent Roaming profiles from syncing certain elements

    - by user29919
    Hello everyone, I'm somewhat new to the Server 2008 front, and I'm afraid I've hit my first snag: I've set up roaming profiles, and they appear to be working too well. Is there a way to limit, ideally on a folder/object basis, what gets synced with a roaming profile? What I'm trying to do is:

    1) Stop my roaming profile from syncing desktop layout. I run a dual-screen desktop and a laptop, and it's really annoying to have to reposition everything after logging onto the laptop, because it forces everything onto one screen.

    2) Stop it from syncing registry variables. Specifically, I want Visual Studio to load different settings files on each computer. Currently, the variable that contains that path is synced whenever I log in, so I get the settings from whatever box I last logged out from.

    3) Stop syncing the Start menu. This one's not as big, but I'm noticing 'program not found' icons even for programs that are installed. They work when I click them - they just look ugly.

    I'm running Windows SBS 2008 x64 with two Win7 clients (x86 Pro and x64 Ultimate). Is there a simple way to do that, or am I trying to work too much against what roaming profiles are designed for? I could, of course, set up different profiles for the desktop and laptop, but that seems to defeat the point of roaming profiles entirely... Thanks in advance! Any help will be much appreciated =)
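
    For the folder-based part of this, the relevant knob is the Group Policy setting "Exclude directories in roaming profile" (User Configuration > Administrative Templates > System > User Profiles), which writes a semicolon-separated list to the ExcludeProfileDirs value under Winlogon. A hedged sketch of the equivalent registry entry (the folder list is an example only); note this mechanism excludes folders, not registry data, so item 2 can't be solved this way:

        reg add "HKCU\Software\Microsoft\Windows NT\CurrentVersion\Winlogon" /v ExcludeProfileDirs /t REG_SZ /d "Desktop;AppData\Roaming\Microsoft\Windows\Start Menu"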

  • Using IIS7 as a reverse proxy

    - by Jon
    My question is pretty much identical to the question listed below, but it did not get an answer, as the poster ended up using Linux as the reverse proxy: http://serverfault.com/questions/55309/using-iis7-as-a-reverse-proxy

    I need to have IIS host the main site and Linux (Apache) host the proxied site(s). So I have:

        site1.com (IIS7)
        site2.com (Linux Apache)

    with subdomains of:

        sub1.site1.com
        sub2.site1.com
        sub3.site2.com

    I want all traffic to go to site1.com, and anything for site2.com should be proxied to the Linux box on the internal network (I believe ARR can do this, but I'm not sure how). I cannot have Apache doing the proxying, as I need IIS exposed directly. Any and all advice would be great.

    EDIT: I think this might help me:

        <rule name="Canonical Host Name" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" negate="true" pattern="^cto\.com$" />
            <add input="{HTTP_HOST}" negate="true" pattern="^antoniochagoury\.com$" />
            <add input="{HTTP_HOST}" negate="true" pattern="www.antoniochagoury\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.cto20.com/{R:1}" redirectType="Permanent" />
        </rule>

    from: http://www.cto20.com/post/Tips-Tricks-3-URL-Rewriting-Rules-Everyone-Should-Use.aspx. I will have a look at this when I have access to the IIS7 box. Thanks
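
    For the proxying itself, that redirect rule isn't quite what's needed: with ARR installed and "Enable proxy" ticked in the ARR server settings, a URL Rewrite rule whose action is Rewrite (not Redirect) to the internal host does the forwarding. A sketch under those assumptions (the internal address 192.168.1.50 is hypothetical):

        <rule name="Proxy site2 to Apache" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <!-- match site2.com and any of its subdomains -->
            <add input="{HTTP_HOST}" pattern="^(.+\.)?site2\.com$" />
          </conditions>
          <!-- Rewrite, not Redirect: ARR forwards the request and relays the response -->
          <action type="Rewrite" url="http://192.168.1.50/{R:1}" />
        </rule>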

  • Datacenter IP Addressing and DNS Management

    - by user65248
    Hello everyone. We are setting up a small datacenter: about 300 amps of power and at most 50 racks - I mention these so you can picture the size and requirements. I have studied networking, mostly Microsoft and Windows-based systems, but I can't work out how IP addressing and DNS management are done in a datacenter, and unfortunately I have to set everything up by myself, though we will have some staff to share the work. Now my questions.

    Datacenter IP addressing:

    1) Suppose we have a block of 200 IP addresses from our ISP. How can I manage this block? Is there software out there to simplify it?

    2) I heard that running a DHCP server in a datacenter is not recommended; otherwise, what would you say about MS DHCP server, considering we need backup servers in case of failure?

    3) How can I assign a block of IPs to a specific rack? I know it differs between management systems, but how is it normally done?

    4) IP addresses are exposed to the whole network. What if a customer tries to use an IP address that is not assigned to their server or rack? How can I prevent this, or how can I track IP usage?

    DNS management: I'm going to set up at least two servers as our DNS servers. I know nothing about datacenter DNS systems, but I have configured DNS servers in ordinary networks and for web servers. Now I want to know:

    1) What exactly needs to be done for DNS in a datacenter that is not done in normal networks?

    2) How can I configure PTR records? Why can't I configure PTR records on my webserver-side DNS server, and why must it be done on the datacenter's DNS server? I mean, what is different about DC DNS servers that allows us to do so? I know the question is very simple, but I'm confused.

    3) Is there software out there that handles the whole thing - automatically adding records to DNS and also managing IP addresses?

    Thanks in advance
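
    On the PTR question, the missing step is delegation: reverse DNS for an IP block can only be served by whichever name servers the block's owner (usually the ISP, via an in-addr.arpa delegation) points at, which is why an arbitrary webserver-side DNS server can't publish PTR records for it. Once the ISP delegates the reverse zone to the datacenter's DNS servers, a BIND reverse zone is a few lines; the 203.0.113.0/24 prefix and hostnames below are documentation examples, not real allocations:

        ; zone "113.0.203.in-addr.arpa" declared in named.conf, then in the zone file:
        $ORIGIN 113.0.203.in-addr.arpa.
        @    IN  SOA  ns1.example-dc.com. hostmaster.example-dc.com. (
                      2011120201 3600 900 604800 86400 )
             IN  NS   ns1.example-dc.com.
             IN  NS   ns2.example-dc.com.
        10   IN  PTR  customer1.example-dc.com.
        11   IN  PTR  customer2.example-dc.com.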

  • Can a hardware firewall block a server accessing its OWN UNC shares?

    - by Simon
    I need to set up a UNC share for my hosted dedicated server to access a share on itself; unfortunately, TFS requires a UNC share. I am on a Windows Server 2008 Standard SP2 64-bit dedicated server behind a PIX 501 firewall, hosted with GoDaddy. I just cannot get the server to access itself, and get this error:

        Windows cannot access \\SERVER\SHARE
        Check the spelling of the name... etc.

    I've found numerous questions about this, but no answer to my problem:

        - Server 2008 Standard x64 SP2
        - Workgroup - not domain
        - Windows Firewall is off
        - Computer Browser service is on
        - I am trying to access \\MYMACHINE\TFS-BUILDS by typing it in, or by double-clicking; neither works
        - Machine has a single network card
        - File sharing wizard says the share was OK
        - Share was showing under 'Computer Management'
        - Permissions are set to 'Everyone' full control
        - No obvious errors in the event log
        - Reboot didn't fix it

    Unfortunately I cannot try to access other shares into or out of this machine, because it is a hosted dedicated server and the only machine behind a hardware firewall. The only thing left I can think of is that the hardware firewall needs to be configured. Is this possible? Does 'UNC traffic' go out of the machine and then back in again?
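
    On the headline question: SMB access to \\MYMACHINE from MYMACHINE itself goes over the loopback interface and never reaches an external firewall, so the PIX is an unlikely suspect. Two registry values commonly checked for local UNC failures (offered as assumptions to test, not a confirmed fix) are the strict name check and the loopback authentication check:

        rem allow the server service to answer to names/aliases it doesn't strictly own
        reg add "HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" /v DisableStrictNameChecking /t REG_DWORD /d 1 /f
        rem disable the NTLM reflection/loopback check for local name access
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v DisableLoopbackCheck /t REG_DWORD /d 1 /f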

  • Installation of Active Directory on separate VM from DNS does not entirely work - not sure why

    - by René Kåbis
    Not sure what I am doing wrong here. I have a moderately midrange server (16 cores, 2 GHz, 32 GB ECC REG RAM, 6 TB storage - nothing too extreme) where I am running Hyper-V (Server 2012 R2 Enterprise) in order to provision virtual machines.

    So why an AD separate from DNS? I want redundancy. I want to be able to move VMs and back them up individually, and not have too many services on any one VM. I have already provisioned a VM with DNS and have set it up right - essentially, I have:

    1. Set up static IPs for everyone involved.
    2. Installed the DNS service on the DNS VM.
    3. Created a forward lookup zone and a reverse lookup zone (primary zone) xyz.ca.
    4. Configured the zones to use nonsecure and secure dynamic updates (I will change this to secure later, after the domain controller is online).
    5. Created an A record for the DC in the forward lookup zone (and a reverse PTR).
    6. Changed the DC's DNS server (network settings) to the new DNS server.
    7. Checked that I can ping the DNS server from the new DC by hostname.

    When I went ahead and did a dcpromo on the DC, and unchecked the "install DNS" option, everything seemed to go well (no error messages), but I saw no changes on the DNS server whatsoever (no additional settings). Plus, the DNS server seems to be unable to join the domain, as it claims that the domain is not discoverable.

    As a final note, I do run Symantec Endpoint Protection, which includes a firewall, with most settings at default. I have not yet tried turning this off, but my experience has been that if a service would open up a port on the Windows firewall, it would do the same through Symantec; there is pretty tight integration these days between corporate-class AV and Windows. I have a template vhdx fully set up (just short of any special roles and features) that I can use to replace the current AD VM with, so doing this all over again is not too much skin off of my nose.
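
    A quick way to see whether the promoted DC ever registered its locator records (a diagnostic sketch, domain name as in the question): the _ldap/_kerberos SRV records live under _msdcs.xyz.ca, and if dynamic update failed they can be re-registered by hand:

        dcdiag /test:dns /v
        nltest /dsgetdc:xyz.ca
        rem re-register the DC's A and SRV records if they are missing from the zone
        ipconfig /registerdns
        net stop netlogon && net start netlogon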

  • What folders to encrypt with EFS on Windows 7 laptop?

    - by Joe Schmoe
    Since I've been using my laptop more as a laptop recently (carrying it around), I am now evaluating my strategy to protect confidential information in case it is stolen. Keep in mind that my laptop is 6 years old (Lenovo T61 with 8 GB of RAM, 2 GHz dual-core CPU). It runs Windows 7 fine, but it is no speed demon, and it doesn't support the AES instruction set.

    I've been using a TrueCrypt volume, mounted on demand, for really important stuff like financial statements forever. Nothing else is encrypted. I just finished my evaluation of EFS and Bitlocker, and took a closer look at TrueCrypt again. I've come to the conclusion that boot partition encryption via Bitlocker or TrueCrypt is not worth the hassle. I may decide in the future to use Bitlocker or TrueCrypt to encrypt one of the data volumes, but at this point I intend to use EFS to encrypt the parts of my hard drive that contain data I wouldn't want exposed.

    The purpose of this post is to get your feedback about which folders should be encrypted from the general point of view (of course everyone will have something specific in addition). Here is what I have thought of so far (will update if I think of something else):

    1) AppData\Local\Microsoft\Outlook - Outlook files
    2) AppData\Local\Thunderbird\Profiles and AppData\Roaming\Thunderbird\Profiles - Thunderbird profiles; not sure yet where exactly the data is stored.
    3) AppData\Roaming\Mozilla\Firefox\Profiles\djdsakdjh.default\bookmarkbackups - Firefox bookmark backups. Is there a separate location for the "main" Firefox bookmark file? I haven't figured it out yet.
    4) Bookmarks for Chrome (don't know where its bookmarks are) and Internet Explorer ($Username\Favorites) - I don't really use them, but why not secure those as well.
    5) Downloads\, My Documents\ and My Pictures\ I don't think I need to encrypt wholesale - no point protecting, say, the latest service pack for Visual Studio. So I will probably create a subfolder called "Secure" in each of these folders, set it to "Encrypted", and save anything sensitive there.

    Any other suggestions? Again, this is from the point of view of your "regular office user".
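
    For scripting the actual encryption, cipher.exe can flag each folder so that current and future contents are encrypted; a sketch over folders like those listed above (profile paths are of course machine-specific):

        cipher /e "%LOCALAPPDATA%\Microsoft\Outlook"
        cipher /e /s:"%APPDATA%\Thunderbird\Profiles"    rem /s: recurses into subdirectories
        cipher /e "%USERPROFILE%\Documents\Secure"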

  • Apache directive for authenticated users?

    - by Alex Leach
    Using Apache 2.2, I would like to use mod_rewrite to redirect unauthenticated users to https if they are on http. Is there a directive or condition one can test for whether a user is (not) authenticated?

    For example, I could have set up the restricted /foo location on my server:

        <Location "/foo/">
            Order deny,allow
            # Deny everyone, until authenticated...
            Deny from all

            # Authentication mechanism
            AuthType Basic
            AuthName "Members only"
            # AuthBasicProvider ...
            # ... Other authentication stuff here.

            # Users must be valid.
            Require valid-user
            # Logged-in users authorised to view child URLs:
            Satisfy any

            # If not SSL, respond with HTTP redirect
            RewriteCond %{HTTPS} off
            RewriteRule ^/foo/?(.*)$ https://%{SERVER_NAME}/foo/$1 [R=301,L]

            # SSL enforcement.
            SSLOptions FakeBasicAuth StrictRequire
            SSLRequireSSL
            SSLRequire %{SSL_CIPHER_USEKEYSIZE} >= 128
        </Location>

    The problem here is that every file, in every subfolder, will be encrypted. This is quite unnecessary, but I see no reason to disallow it. What I would like is for the RewriteRule to only be triggered during authentication. If a user is already authorised to view a folder, then I don't want the RewriteRule to be triggered. Is this possible?

    EDIT: I am not using any front-end HTML here. This is only using Apache's built-in directory browsing interface and its built-in authentication mechanisms. My <Directory> config is:

        <Directory ~ "/foo/">
            Order allow,deny
            Allow from all
            AllowOverride None
            Options +Indexes +FollowSymLinks +Includes +MultiViews
            IndexOptions +FancyIndexing
            IndexOptions +XHTML
            IndexOptions NameWidth=*
            IndexOptions +TrackModified
            IndexOptions +SuppressHTMLPreamble
            IndexOptions +FoldersFirst
            IndexOptions +IgnoreCase
            IndexOptions Type=text/html
        </Directory>
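
    Apache has no rewrite condition for "already authenticated", because Basic credentials are sent on every request rather than held in a session. A common workaround (a sketch, under the assumption that keeping credentials off plain HTTP is the real goal) is to drop mod_rewrite here and let SSLRequireSSL do the work, with a custom 403 page that bounces plain-HTTP visitors to the https URL:

        <Location "/foo/">
            AuthType Basic
            AuthName "Members only"
            Require valid-user
            # Refuse plain HTTP outright...
            SSLRequireSSL
            # ...and turn the resulting 403 into a redirect to the https site
            # (an absolute URL in ErrorDocument makes Apache issue a redirect)
            ErrorDocument 403 https://www.example.com/foo/
        </Location>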

  • i7 overclocking problem

    - by benwebdev
    Hi everyone, I've got an overclocked system that seems to be misbehaving. When trying to boot up from cold, the system just hangs and nothing is output to the screen. Fans are on as they should be, but nothing happens to finish the boot. From here I have to switch it off, then on again, and the boot completes. If I go into the tweak menu in the BIOS, I'm informed that a boot has failed.

    I've been in touch with Overclockers UK support a bit, and there's not yet been a solution; we've mainly been tweaking the voltage for the CPU. Any suggestions? I'm new to overclocking, which is why I got a bundle with OCUK. With this issue happening only on cold boot, it's tricky to test, as I have to make a change and then wait until the next day. My system is:

        Intel Core i7 930 2.80 GHz overclocked to 4 GHz
        Gigabyte X58A-UD3R (BIOS version: FC)
        6 GB RAM
        Power supply: CoolerMaster Silent Pro M series 700W

    One suggestion made by OCUK was that maybe it's the power supply, but I'm not sure and don't have a spare - it's brand new and a pretty expensive piece of kit. Any thoughts on this? Other recommendations for power? Thanks, Ben

  • How to enable IPv6 on my Ubuntu 11.04 virtual machine

    - by liv2hak
    I have installed 3 VMs on my PC (Ubuntu 11.04). I want to set up an IPv6 network to review and test some of the IPv6 tools, like NDPMonitor (which monitors ICMP messages of the Neighbour Discovery Protocol). The IPv6 addresses are as follows:

        linux_router - fe80::a00:27ff:fed5:f7e9/64
        labhack1     - fe80::a00:27ff:fed2:8bd1/64
        labhack2     - fe80::a00:27ff:fed7:2f2d/64

    The commands below have been run on both linux_router and labhack1:

        sudo ip r a 2001:468:181:f100::/64 dev eth0
        sudo vim /etc/radvd.conf

    The file looks like this:

        interface eth0
        {
            AdvSendAdvert on;     /* means that we are sending out advertisements */
            MinRtrAdvInterval 5;  /* these options control how often advertisements are sent; */
            MaxRtrAdvInterval 15; /* they are not mandatory, but valuable settings */
            prefix fe80::a00:27ff::/64
            {
                AdvOnLink on;     /* says to the host: everyone sharing this prefix is on the same local link as you */
                AdvAutonomous on; /* says to a host: "use this prefix to autoconfigure your address" */
            };
        };

    followed by:

        sudo sysctl -w net.ipv6.conf.all.forwarding=1

    When I try to do a ping6 -I eth0 fe80::a00:27ff:fed5:f7e9, I get "Destination unreachable: Address unreachable". I am not sure what I am doing wrong here; I am a beginner at Linux administration. Basically, I think I am missing whatever is analogous to physically connecting the VMs. Any help that would point me in the right direction would be greatly appreciated.
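
    One concrete problem visible above (an observation, not a guaranteed full fix): the prefix stanza advertises fe80::a00:27ff::/64, but fe80::/64 is the link-local prefix - hosts derive link-local addresses on their own, and radvd is meant to advertise a routable prefix, such as the 2001:468:181:f100::/64 already added as a route. A corrected sketch of the stanza:

        interface eth0
        {
            AdvSendAdvert on;
            MinRtrAdvInterval 5;
            MaxRtrAdvInterval 15;
            # advertise the routable prefix; fe80::/64 (link-local) cannot be advertised
            prefix 2001:468:181:f100::/64
            {
                AdvOnLink on;
                AdvAutonomous on;
            };
        };

    It is also worth confirming in the hypervisor's settings that all three guests are attached to the same internal/host-only network, since that is the virtual equivalent of plugging into the same switch.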

  • Windows XP seemingly out of resources but plenty of free RAM and swap available

    - by Artem Russakovskii
    This one has been bothering me for years, and so far I couldn't find an adequate solution. The problem occurs on pretty much every XP install I've done. After opening a variety of programs, or after the system runs existing programs for a while, Windows seemingly runs out of resources without telling me. There's ALWAYS free RAM. For example, it just happened to me and I had over a gig of free RAM. There are no viruses, spyware, or other nonsense - it is a Windows resource problem, but the question is: which resource is it running out of, how does one pinpoint it, and how does one prevent it?

    Sometimes this happens after running specific programs - for example, today it happened when I started Photoshop CS4 and Flash CS4 at the same time. I also noticed that restarting The Bat (email client by Ritlabs) seems to get rid of this problem for a while, but again, this happens on machines that don't even have The Bat installed.

    So what exactly happens? The symptoms are:

    - Pressing Alt-Tab doesn't bring up the list anymore - it just jumps to the next window instantly, very similar to the way Alt-Esc works; in this case it's due to not having enough resources to bring up the Alt-Tab menu.
    - Random programs crash, citing random errors: out-of-memory errors, system resources, inability to do system calls, etc.
    - Random programs start missing random parts - for example, Firefox top menus might disappear, pull up partial selections, or not pull up anymore altogether. IE might lose a few of its toolbars. Some programs might fail to redraw, or would just plain go gray where the UI used to be.

    Windows itself never complains about running out of RAM, virtual memory, or anything at all, yet it's running out of something. The only clue I was able to find, and the fix I applied today, was this Desktop Heap Limitation. I haven't confirmed the fix as working, as not enough time has passed. In the meantime, what are everyone's thoughts?
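
    For reference, the desktop heap tweak mentioned above lives in the SharedSection part of a registry value: the second number is the heap (in KB) for the interactive desktop, and raising it is the usual move for these symptoms. A sketch, with sizes that are common suggestions rather than authoritative values:

        Key:   HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems
        Value: Windows (REG_EXPAND_SZ), containing ...SharedSection=1024,3072,512...
               try raising the middle value, e.g. 3072 -> 8192 (interactive desktop heap, KB)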

  • Sync two Exchange accounts or read-only access to subfolders

    - by cpgascho
    This is two questions, kind of. The situation is as follows: I am running SBS 2008 with Exchange 2007. There is a shared account with subfolders that keep track of the progress of jobs coming into the company (i.e. sales). I need to give other people in the company read access to this mailbox, not full control. When I give read-only access to the root, other users can only see the Inbox and not the subfolders; permissions have to be applied to each folder.

    One solution I have considered is creating a secondary mailbox that everyone could have full access to, with a one-way sync from the sales mailbox to the secondary mailbox. Then people could see what was happening without messing up the main mailbox by accident (at worst they would mess up the secondary mailbox). Ideally, I could find a way to propagate the read-only permissions to all the subfolders. I have tried using PFDavAdmin to do this, but have not been able to get it to connect successfully from Windows 7 to Exchange 2007.

    Any idea how to:

    1. Propagate permissions (get PFDavAdmin to work?!)
    2. Sync mailboxes
    3. Other solutions?

    Thanks, Chris
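
    At the mailbox-object level, the Exchange Management Shell can at least grant read rights in one shot; folder-level propagation still needs a tool like PFDavAdmin (or ExFolders on later service packs). A sketch, with placeholder names, and with the caveat that ReadPermission on the mailbox object does not by itself expose every subfolder in all clients:

        Add-MailboxPermission -Identity "Sales" -User "CS Team" -AccessRights ReadPermission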

  • Why can't this user connect to a domain share?

    - by Saariko
    As part of reorganizing credentials in the domain, I have created several users to be used solely for services (backup, LDAP, etc.). The idea is that systems needing specific access will use a dedicated service user that gives them exactly what they need.

    However, I am having trouble setting this up correctly. For this example, I have a NAS (ReadyNAS 1100 by Netgear) that runs its own backup jobs. The job reads from a domain share, \\domain\qa, and copies all data to another location:

    - When using domain\administrator, everything works.
    - When I enter the domain\srv.backup user, I get an error connecting to the folder.
    - srv.backup is part of the 'Domain Admins' group, which is a member of 'Administrators'.
    - I thought there might be propagation issues, but even when srv.backup was a direct member of 'Administrators', the error still occurred.
    - I have 2 DCs (W2K8 R2 replicas); I thought that could also cause a problem, but as far as I can tell it's not the issue.
    - Sharing permissions are open to Everyone.
    - I double-checked that srv.backup is part of the 'Domain Admins' group, and also tried with a simple password.

    [Screenshots of the folder security settings, the NAS dashboard test window, and the group membership were attached to the original post.]

    What else do I need to check? Thanks.
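
    To take the NAS's SMB client out of the equation, it may help to test the same credentials from any Windows box (a diagnostic sketch; the drive letter is arbitrary):

        net use Z: \\domain\qa /user:DOMAIN\srv.backup
        rem if this works, the account is fine and the NAS-side configuration is suspect;
        rem check whether the NAS expects user@domain or DOMAIN\user syntax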

  • Gray progress bar at MacBook boot caused by partition fault

    - by Konstantin Bodnya
    Here's my problem: when I try to boot my MacBook, it shows the gray progress bar. It takes a while to fill the whole bar, and after that the MacBook just shuts down. I tried to boot from the recovery partition and run Disk Utility to repair it; Disk Utility showed my "Macintosh HD" in gray and failed to repair it. I thought all my data was lost, but then I tried the following: I booted into Ubuntu from a live USB, and it successfully mounted my Macintosh HD HFS+ partition. parted shows me the following partitions on the disk:

        Disk /dev/sda: 500GB
        Sector size (logical/physical): 512B/512B
        Partition Table: gpt

        Number  Start   End    Size   File system  Name                  Flags
         1      20.5kB  210MB  210MB  fat32        EFI System Partition  boot
         2      210MB   499GB  499GB  hfs+         Macintosh HD
         3      499GB   500GB  650MB  hfs+         Recovery HD

    Seems legit, except for FAT32 on the EFI System Partition. Since all my data is okay and backed up, what should I do to recover the system? I don't really want to reinstall the whole system, though I believe there's a command to make it all right in Linux. Thank you, everyone!
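
    The Linux-side command hinted at is probably fsck.hfsplus from the hfsprogs package; run against the unmounted HFS+ volume, it can sometimes repair damage Disk Utility refuses to touch. A sketch, assuming the layout shown above (/dev/sda2 is the Macintosh HD partition):

        sudo apt-get install hfsprogs
        sudo umount /dev/sda2           # the volume must be unmounted before checking
        sudo fsck.hfsplus -f /dev/sda2  # -f forces a check of the journaled volume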

  • Massive SQL issue shutting down our site.

    - by Pselus
    Our website has started timing out like crazy today; all of our clients are finding it unusable. The only error we can seem to trace down as a potential problem is this:

        SQLAllocHandle on SQL_HANDLE_DBC failed
        Error category: Microsoft OLE DB Provider for ODBC Drivers (ASP)

    I have no idea what it means or how to go about fixing it. Has anyone encountered this error before? Currently, you can log in to our site, but once you go to do anything else, you find yourself logged out, or nothing happens. We have a lot of Ajax going on, so the "nothing happens" probably means the Ajax pages are not loading properly due to the logouts, so nothing displays to the user. Like I said, I'm at a loss. Does anyone have any advice on this error?

    EDIT: I realize that this isn't necessarily a programming question, but we are a small startup company that just yesterday started talking about how we need to get a backup server running. Apparently we talked about it too late. We don't have a DBA, just 2 mid-level programmers trying their hardest to keep our clients happy. So please, if you have any assistance, give it, but please don't close my question right now.

    EDIT 2: It turns out we had something running on our server called "ServerMask", which makes our IIS server look like Apache to the outside world. Shutting it down fixed our issue. We still have no idea why it was causing the problem, but it apparently was. Thanks to everyone who tried to help.

  • Redirect specific domains with DNS

    - by user66377
    Currently we filter internet content using OpenDNS: our internal Windows DC/DNS servers point to the router's DNS, which then points to the OpenDNS servers. This works well to block all computers on the network equally.

    New issue: we now need to separate which computers can go to which sites. Facebook is blocked for everyone right now, but I need to open it up to the 3 community computers. The 3 community computers will be on an untrusted network, separate from the company computers, so they can have their own DNS server behind their own router. The catch is that they must still connect to the internet from the same IP address, so OpenDNS sees the same IP and blocks them the same way. We are looking into getting a second IP, but it's likely not an option without going up to the next major tier with our ISP, which we don't want to do.

    My thought is this: can I set up a DNS server on the untrusted network and, depending on the request that comes in, have it forward to either OpenDNS or our ISP's DNS? Example: www.facebook.com and www.youtube.com are both on the OpenDNS blacklist. If someone goes to www.youtube.com, the local DNS server queries the ISP's DNS to get the IP, and thus the client gets the right IP and can reach the site. This would be manually entered for each allowed site, thus creating a whitelist. Then, if they go to www.facebook.com, since the local DNS server does not find an entry, it sends the request to OpenDNS, which sees the site is on the blacklist and sends back its blocked page.

    The local DNS server can be either BIND on Linux or MS DNS on Windows 2008. If this can be done, can you give some direction as well, as I've never set up a DNS server like this before. Thanks
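
    This per-domain split is exactly what BIND forward zones do: a global forwarder sends everything to OpenDNS by default, and each whitelisted domain gets its own zone that forwards to the ISP instead. A sketch for named.conf (the OpenDNS resolver addresses are the public ones; the ISP resolver address is a placeholder):

        options {
            // default: everything goes to OpenDNS, so the blacklist still applies
            forwarders { 208.67.222.222; 208.67.220.220; };
            forward only;
        };

        // whitelist entry: resolve youtube.com via the ISP instead of OpenDNS
        zone "youtube.com" {
            type forward;
            forward only;
            forwarders { 203.0.113.53; };  // hypothetical ISP resolver
        };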

  • Cyrus: In practical terms, how do end users administer their shared mailboxes?

    - by Nick
    Let's say we have four customer service reps: Billy, Bob, Joe, and Tom. Tom is the department manager. There's a shared Customer Service mailbox on the Cyrus server that they all have access to. Tom, as the manager, also has administrative privileges for the shared mailbox.

    They decide they want to create subfolders a certain way, and Tom creates them. They're all running Thunderbird, so Tom right-clicks the main folder and chooses "New Subfolder". Now Tom has the subfolders he needs, and the other sales reps have... nothing! Because Cyrus created the subfolders giving Tom "full access" permissions, and everyone else gets no access.

    So how does Tom give the other reps in his department access to the new folders? As far as Cyrus is concerned, Tom has permission to grant others access to his new mailboxes - but as far as I can tell, there's no option in Thunderbird for granting mailbox permissions.

    An IT staff member should not have to receive a support request every time someone wants to add a subfolder to a shared mailbox; that's why we make certain users into mailbox admins in the first place! But asking (non-technical) users to SSH into an IMAP server to run cyradm seems like a bad idea too. Certainly someone has found a solution to this dilemma: perhaps a Thunderbird extension for setting Cyrus permissions? Or something like umask, which forces subfolders to have permissions identical to their parents' on creation?

    And related: what about Sieve configuration? Is there any way that can be done from the client machine too? Thanks, Nick
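
    For what it's worth, folder ACLs are part of the IMAP protocol itself (the RFC 4314 SETACL command), so any client or extension that speaks it can grant rights without SSH. What Tom would otherwise run via cyradm looks like this sketch (mailbox path and rights string are examples):

        cyradm --user tom imap.example.com
        localhost> setaclmailbox shared/customerservice/newfolder billy lrsw
        localhost> setaclmailbox shared/customerservice/newfolder bob lrsw

    Sieve is similarly client-manageable, over the separate ManageSieve protocol (RFC 5804), which Cyrus's timsieved serves and some Thunderbird add-ons speak.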

  • Copy a large LVM volume (14TB) from one server to another

    - by bruce
    Recently, I have to copy a very large LVM volume from server A to server B. Below are the filesystems of server A and server B.

    Server A:

        [root@AVDVD-Filer ~]# df -h
        Filesystem                         Size  Used  Avail  Use%  Mounted on
        /dev/mapper/vg_avdvdfiler-lv_root   16T   14T   1.5T   91%  /
        tmpfs                              3.0G     0   3.0G    0%  /dev/shm
        /dev/cciss/c0d0p1                  194M   23M   162M   13%  /boot
        /dev/mapper/vg_avdvdfiler-test     2.3T  201M   2.1T    1%  /test
        /dev/sr0                           3.3G  3.3G      0  100%  /mnt

    Server B:

        [root@localhost ~]# df -h
        Filesystem                       Size  Used  Avail  Use%  Mounted on
        /dev/mapper/VolGroup-LogVol00     20G  2.5G    16G   14%  /
        tmpfs                            3.0G     0   3.0G    0%  /dev/shm
        /dev/cciss/c0d0p1                194M   23M   162M   13%  /boot
        /dev/mapper/VolGroup00-LogVol00   16T  133M    15T    1%  /xiangao/lv1
        /dev/mapper/VolGroup00-LogVol01  4.7T  190M   4.5T    1%  /xiangao/lv2

    I want to copy the LVM volume /dev/mapper/vg_avdvdfiler-lv_root on server A to the LVM volume /dev/mapper/VolGroup00-LogVol00 on server B. Server A and server B are in the same IP segment. The LVM volume on server A holds avi, wmv, mp4, etc. files averaging 500 MB each. I tried mounting /dev/mapper/vg_avdvdfiler-lv_root from server A on server B through NFS, then copying with the cp command; clearly, I failed. Because the LVM volume is so big, I don't have a good idea how to proceed, and I hope for a good solution here. (I'm Chinese and my English is very poor - sorry, and thanks, everyone!)
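
    For a transfer this size, rsync over SSH is the usual sketch, since an interrupted run can resume where it left off - important when copying 14 TB takes days (paths as in the question; server B's hostname is a placeholder):

        # on server A; lv_root is mounted at /, so -x keeps rsync on that one
        # filesystem and skips /proc, /sys, /test and other mounts
        rsync -avPx / root@serverB:/xiangao/lv1/
        # -P keeps partially transferred files, so re-running resumes the copy

    If raw throughput matters more than resumability on a trusted LAN, tar piped over netcat is a common alternative, since it avoids SSH encryption overhead.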

  • 3 servers - is this a cluster?

    - by Andy Barlow
    Hello. At the moment I have one Ubuntu 9.10 server running a simple Samba share, a mail server, a DNS server and a DHCP server. Mostly it's just there for file sharing and email. I also have 2 other servers with exactly the same hardware and spec as the first, which have an rsync job set up to retrieve the shared folders and back them up.

    However, if the first server goes down, all of our shares disappear along with our mail, and the system must be rebuilt. I also tend to find that if people are downloading a large amount from the file server, no one can access their email - especially in the morning when everyone is signing in at once.

    Would it be more beneficial for me to have all 3 servers running the same services, doing the same thing, in some sort of cluster with load balancing? I'm not really sure where to begin looking, or how to go about a setup where 3 servers are all identical, with perhaps one acting as the main load balancer. If someone can point me in the right direction, or if this simply sounds like one of those enterprise clouds that is now a default setup in Ubuntu Server 9.10+, then I'll go down that route. Cheers in advance, Andy

  • Exchange 2007 relay from sendmail, message "Undelivered". Possible reasons?

    - by garlicman
    Note: this is my re-post from Stack Overflow.

    I've been messing with a test environment, for security purposes, where a DMZ RHEL5 sendmail server is used as a relay for an Exchange 2007 server. Exchange is working in the environment; I have Vista and XP VMs using Outlook on the domain to send e-mail to each other. I've been trying to simulate an external internet VM sending an e-mail to the DMZ sendmail relay, which forwards to the Exchange server.

    Before everyone thinks this is too big a problem/question: I've followed the sendmail/Exchange guides, and all I want to know is how I can determine why a relayed message in Exchange is "Undelivered". Basically, I send an SMTP message to the sendmail server, which relays to Exchange. /var/log/maillog shows the e-mail being relayed to Exchange:

        Nov 17 13:41:22 externalmailserver sendmail[9017]: pAHIfMuW009017: from=<[email protected]>,
            size=1233, class=0, nrcpts=1, msgid=<[email protected]>, proto=ESMTP, daemon=MTA,
            relay=[10.50.50.1]
        Nov 17 13:42:17 externalmailserver sendmail[9050]: pAHIfMuW009017: to=<[email protected]>,
            delay=00:00:55, xdelay=00:00:36, mailer=relay, pri=121233,
            relay=mailserver.xyz.local. [192.168.1.20], dsn=2.0.0,
            stat=Sent (<[email protected]> Queued mail for delivery)

    This is good, but the recipient never receives the e-mail from Exchange. So I started poking around Exchange. In the "Message Tracking" troubleshooting assistant, I queried the processed messages and found this (I had to copy and paste the cells - sorry for the format):

        2011/11/17  RECEIVE  SMTP  <[email protected]>  "Undelivered Mail Returned to Sender"
        [email protected]  [email protected]  192.168.100.10  MAILSERVER\DMZ Relay  [email protected]

    I just want to know if anyone has any suggestions on why the DMZ relay receive connector I set up isn't relaying, and is instead returning the forwarded e-mail to the sender as undelivered. My Exchange relay receive connector is pretty simple: the Exchange server's FQDN is set as the HELO response, all available IP addresses can receive relayed e-mail, and the IP address of my sendmail server is specifically set as a remote server.
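
    One way to get past the GUI's terse "Undelivered" is the message tracking log itself, which records FAIL and DSN events with the actual enhanced status code. A sketch for the Exchange Management Shell (server name as in the question; narrow the window to the test run):

        Get-MessageTrackingLog -Server MAILSERVER -Start "11/17/2011 13:00" -End "11/17/2011 14:00" |
            Where-Object { $_.EventId -eq "FAIL" -or $_.EventId -eq "DSN" } |
            Format-List Timestamp,EventId,Sender,Recipients,MessageSubject,RecipientStatus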

  • sudo or acl or setuid/setgid ?

    - by Xavier Maillard
    Hi, for a reason I do not really understand, everyone wants sudo for anything and everything. At work we even have as many sudoers entries as there are ways to read a logfile (head/tail/cat/more, ...). I think sudo is self-defeating here. I'd rather use a mix of setgid/setuid directories, and add ACLs here and there, but I really need to know what the best practices are before starting.

    Our servers have %admin, %production, %dba, %users - i.e. many groups and many users. Each service (mysql, apache, ...) has its own way of installing privileges, but members of the %production group must be able to consult configuration files and even log files.

    There is still the option of adding them to the right groups (mysql, ...) and setting the right permissions. But I do not want to usermod all users, and I do not want to modify the standard permissions, since they could change after each upgrade. On the other hand, setting ACLs and/or mixing setuid/setgid on directories is something I could easily do without "defacing" the standard distribution.

    What do you think about this? Taking the mysql example, it would look like this:

        setfacl -m d:g:production:rx,d:other::---,g:production:rx,other::--- /var/log/mysql /etc/mysql

    Do you think this is good practice, or should I definitely usermod -G mysql and play with the standard permission system? Thank you
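
    One detail worth adding, as an assumption about the intent (the command above isn't recursive): -R covers files that already exist, while only the default (d:) entries cover files created later, and getfacl verifies the result:

        setfacl -R -m g:production:rX,d:g:production:rX /var/log/mysql /etc/mysql
        # capital X grants execute only on directories (for traversal), not on files
        getfacl /var/log/mysql    # verify both the effective and the default entries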

  • I accidentally hijacked my localhost

    - by Zach L
    Opening localhost in the browser is pointing at a local webpage (examplePage) after I played with some config files a while back, and I can't figure out how to restore the default behavior.

    Background: I have XAMPP installed on my Windows 7 machine, and a webpage at c:/xampp/htdocs/examplePage. A couple of weeks ago, I was on a mission to get site-root-relative URLs (/resource) to work, so I played around with a bunch of apache/conf files, including httpd.conf and httpd-vhosts.conf, and was also messing with the Windows hosts file. I gave up at some point, didn't document exactly what I did, and have since probably forgotten some of it. Many of my changes stemmed from suggestions in this Stack Overflow post.

    What I've tried:

    - I commented out my additions to the hosts file
    - I turned off XAMPP (thus hopefully negating any effect of the Apache config files)
    - I reverted to my original DocumentRoot in httpd.conf anyway (xampp/htdocs)

    localhost still displays examplePage, even with XAMPP turned on (my reverted DocumentRoot isn't taking effect). Does anyone know what I may have done and how I can fix it?

    Update: It's been resolved - thanks so much, everyone. In Task Manager there were a couple of instances of httpd.exe (Apache HTTP Server). I ended these, opened XAMPP and restarted Apache. All references to examplePage in my .conf files that I could find had been commented out or removed, so I imagine the old versions were still in effect for some reason, and manually ending the Apache processes fixed this. As a point of interest, it's still a mystery why those processes were running - I cannot reproduce that situation. I must've stumbled upon a XAMPP bug of some sort.
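
    For anyone hitting the same wall, the check that unmasks a stale server still holding port 80 is quick (a sketch; the PID shown is whatever netstat actually reports):

        netstat -ano | findstr ":80 "
        rem note the PID in the last column, confirm it is httpd.exe, then kill it:
        tasklist /fi "PID eq 1234"
        taskkill /PID 1234 /F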
