Search Results

Search found 16329 results on 654 pages for 'b long'.

Page 451/654 | < Previous Page | 447 448 449 450 451 452 453 454 455 456 457 458  | Next Page >

  • Site-to-site VPN problem: connection successful, but data flows only one way?

    - by Charles
    To start things off, I'm not the actual administrator of the VPN server, but he is also at a loss, so I thought I'd ask here. I know it's a Cisco ASA firewall/VPN. I have a router that connects to the Cisco VPN server, and it does so successfully. I can ping everything within the remote network and from the remote network into my own. I've been able to SSH into a remote server over the VPN as well; it all seems to work until there is more data coming back.

    A quick example is an internal webserver. Its default homepage simply redirects, so it only sends back HTTP headers with a "Location:". I receive this on my computer, but when I then request the actual page (which isn't that big) I don't get a response at all - it just stalls. The same happens with other services, for example SSH: I can do a couple of things while connected, but if there's more than xx output it seems to do nothing. The connection remains active throughout all of this.

    Has anyone experienced anything like this before, or know what the problem might be? Another user who has a site-to-site connection with this VPN using the exact same setup has no problems; the only difference is that I have around 200ms ping to the VPN server/network because of the very long distance (another continent).
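    The symptoms (small responses arrive, anything larger than roughly one packet stalls, yet the tunnel stays up) are the classic signature of an MTU/fragmentation problem over the tunnel. A rough way to test this from the Windows side - the host name and sizes below are placeholders, not values from the question:

        REM find the largest payload that crosses the tunnel without fragmenting
        ping remote-host -f -l 1472
        ping remote-host -f -l 1400
        REM keep lowering -l until the ping succeeds;
        REM largest working payload + 28 bytes of headers = the usable path MTU

    If the path MTU turns out to be smaller than 1500, lowering the interface MTU on your router or clamping TCP MSS on the tunnel endpoints (on an ASA, something like "sysopt connection tcpmss 1350", which only the actual administrator can apply) is the usual fix.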

    Read the article

  • 403 Forbidden When Using AuthzSVNAccessFile

    - by David Osborn
    I've had a nicely functioning SVN server running on Windows that uses Apache for access. In the original setup every user had access to all repositories, but I recently needed the ability to grant a user access to only one repository. I uncommented the AuthzSVNAccessFile line in my httpd.conf, pointed it at an access file, and set up that file, but now I get a 403 Forbidden when I go to mydomain.com/svn. If I comment the line back out, things work again. I also made sure I uncommented the LoadModule authz_svn_module line and verified that it points to the correct file. Below are the Location section of my httpd.conf and my svnaccessfile.

    httpd.conf (Location section only):

        <Location /svn>
          DAV svn
          SVNParentPath C:\svn
          SVNListParentPath on
          AuthType Basic
          AuthName "Subversion repositories"
          AuthUserFile passwd
          Require valid-user
          AuthzSVNAccessFile svnaccessfile
        </Location>

    (I want a more complex policy in the long run, but did this just to test the file out.)

    svnaccessfile:

        [svn:/]
        * = rw

    I have also tried just the following for the svnaccessfile:

        [/]
        * = rw

    I also restart the service after each change just to make sure it is picked up.
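    For comparison, a minimal access file that keeps everyone's access but restricts one repository to a single user might look like the sketch below - the repository, group and user names are made up, and section names must match the repository directory names under SVNParentPath:

        [groups]
        everyone = alice, bob

        [/]
        @everyone = rw

        [restricted-repo:/]
        * =
        carol = rw

    A common cause of a blanket 403 with AuthzSVNAccessFile is a relative path (which Apache resolves against ServerRoot) or a section header that doesn't match any real repository name, so an absolute path such as C:\svn\svnaccessfile is worth trying.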

    Read the article

  • Set up a new domain controller over a temporary VPN, but now Windows delays startup?

    - by Kris Anderson
    I'm migrating servers from colo locations to Amazon's VPC EC2 instances. If anyone hasn't worked with Amazon VPC before, VPN is a pain in the arse! Anyway, I set up a new server that acts as the domain controller for our Amazon VPC. In order to migrate all the user accounts from our existing domain controllers, I manually connected to our colo VPN from the new Amazon EC2 machine. I was able to join the domain, and the new Amazon server became another domain controller on our network. So far so good.

    The problem I'm having is that when booting the EC2 domain controller (which is no longer connected to the VPN, so it can't communicate with the existing controllers), it takes a good 6-8 minutes before I can remote into the server, instead of the 1-2 minutes it should take. During this time, most of the services we run (like IIS) give 404 errors until the 6-8 minutes have passed. It's almost like the domain controller is attempting to reach the other domain controllers first and only falls back to the one on the local machine after 6-8 minutes. I don't think that's what's actually happening, though, because Server 2008 R2 doesn't have primary and backup domain controllers; they're all equal as far as Windows is concerned. For my network adapter I have only one DNS server listed, 127.0.0.1, so it should be looking up the local domain controller and not the other domain controllers it connected to over the VPN when the VPN was enabled.

    In the server logs I'm seeing these warnings pop up during a reboot:

        The winlogon notification subscriber is taking long time to handle the notification event (CreateSession).
        The winlogon notification subscriber took 409 second(s) to handle the notification event (CreateSession).

    Any ideas on what's happening here? I would try removing the existing domain controllers from the new Amazon EC2 machine, but I still need to connect over the VPN a few times to migrate some data between the servers, and I don't want that change being replicated back to the other domain controllers in our colo locations.
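    A few commands can help confirm whether the isolated DC is actually resolving and selecting itself at boot - the domain name below is a placeholder:

        REM which domain controller this machine locates for the domain
        nltest /dsgetdc:yourdomain.local
        REM verbose DNS health check when run on the DC itself
        dcdiag /test:dns /v
        REM confirm the local DNS server returns the DC locator SRV records
        nslookup -type=SRV _ldap._tcp.dc._msdcs.yourdomain.local 127.0.0.1

    If these hang or return the colo DCs while the VPN is down, the long CreateSession delay is likely Group Policy waiting on unreachable peers.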

    Read the article

  • Firefox does not upgrade properly (infinite installation loop without success)

    - by Mehper C. Palavuzlar
    I've been using Firefox for a long time, but now I have a problem with the upgrade process. As you know, Firefox automatically checks for the latest version and downloads it, and when it is ready to be installed it starts the installation process as usual. Last month, when I clicked on "OK, install v3.6.2", the process started but couldn't finish properly. FF gave a message like this: "Installation cannot be completed, please restart Firefox to retry". When I restarted FF, the same thing happened, and the process entered an infinite loop. I restarted my PC and clicked on the FF icon, and again the installation process began and went into the infinite loop. My solution was to download FF 3.6.2 and install it manually after killing firefox.exe. Today, FF offered to install v3.6.3 and the same problem occurred again. I manually downloaded FF 3.6.3 and installed it successfully. Has anyone else experienced this problem? In case you want to know, I have not installed any new add-ons in the last few months, and otherwise my FF operates without problems. I'm using Windows 7 Home Pro x64.

    Read the article

  • How to copy a bunch of pages? Is there a 3rd party tool?

    - by unknown (yahoo)
    (I asked the following question at the DNN forum and also at Snowcovered. Nobody knew of such an obvious time-saver being for sale. I'm posting here in case anybody knows of a freeware module that might do this.)

    By "groups of DNN pages" I mean pages that form a hierarchy (not necessarily a hierarchy headed by a page at the same level as the Home page). I know that I can copy web pages one by one, as admin, via the web-based DNN interface. But I'd prefer a script or wizard of some sort (that runs scripts behind the scenes) that allows me to:

    1) specify a web page that I want to copy (along with the hierarchy of pages under it)
    2) specify the names and titles of the new top-level pages
    3) specify whether each contained module of the top-level page I want to copy is to be: ( ) New ( ) Copy ( ) Reference (as in the web-based interface)
    4) repeat 3) for each of the source pages in the hierarchy that I want to copy

    You might say that I am looking to do something similar to creating a portal web site based on a template, except that it's not an entirely new website - it's a section of the current web site. I might want to do this because I have an organization which is broken into chapters, and I want each chapter to have, say, its own General Information page (which acts as its home page), and underneath that, in its hierarchy, a Contact Info page and an Events page. So, going from:

        Home Page
          General Information Page
            Contact Info
            Events

    to:

        Home Page
          General Information Page
            Contact Info
            Events
          General Information Page Kiwanis - Bloomfield
            Contact Info
            Events
          General Information Page Kiwanis - Dayton
            Contact Info
            Events

    If I have 200 chapters, I certainly don't want to copy those 3 web pages using the web-based interface, as that would take a long time. (And imagine if each chapter's new sub-website had 30 pages!) I just want to specify the parameters of a copy process, press a button, and let the system do the rest.

    Read the article

  • Windows, never "lose focus" of the current window

    - by Mazura
    I want the taskbar button to light up orange and the active window to never lose focus to anything. If I'm installing something and then go play a game, at some point the game will drop out to the finished installation. Also, if I'm installing multiple programs at once, my 'Next' button can all of a sudden become the "click here to install this crappy toolbar" button of another program's installer. Of course some programs have settings to not "lose focus" or to "stay on top", but I really want Windows to handle it. If it's somehow an exe called, say, Taskswtich.exe, I could possibly use Process Blocker, but I'm assuming it's part of a function call or some such. For XP I found this: How to disable auto focus of opened Windows applications? But what about Windows 7? And this old post, Preventing applications from stealing focus, with a bunch of long answers that say "no". I'd appreciate this not being merged with a 4-year-old question. I'd like to avoid 3rd party software. This is 2014, don't we know how to hack Windows yet?
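    The only built-in knob is the old foreground-lock timeout in the registry; it discourages some focus steals, but many installers bypass it, so treat the sketch below as an experiment rather than a fix (the 200000 ms value is just a commonly used large number):

        reg add "HKCU\Control Panel\Desktop" /v ForegroundLockTimeout /t REG_DWORD /d 200000 /f
        REM log off and back on for the change to take effect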

    Read the article

  • Windows 8.1 does not boot after CHKDSK command

    - by sepehr
    I had a problem with high disk usage in Windows 8.1, so I searched the forums for a solution. A highly voted answer suggested using CHKDSK, as follows: run a command prompt as administrator and type something like the snippet below (I can't remember it exactly):

        CHKDSK /f /v /b C:

    CHKDSK printed: "Cannot run CHKDSK right now. Would you like to schedule drive C to be checked the next time Windows starts? (Y/N)" - and my response was "Y", please!

    Following this suggestion not only didn't solve my problem, it added another one. The next time the system booted, after Windows authentication I could only see a black screen and the mouse pointer. So I forced the system to shut down and tried to start Windows again. This time Windows got stuck at "Scanning and repairing" at 22% for around 3 hours, so I got tired and forced a shutdown again.

    Is CHKDSK the source of the "Scanning and repairing" problem, or are the two unrelated? Is there any hope of overcoming this without re-installing Windows? Can anyone else run CHKDSK on newer versions of Windows without problems? Is CHKDSK effective but inefficient (it eventually finishes, but takes a very long time)? If yes, how much time does it take? I also have Linux Ubuntu 14.04 installed alongside Windows.
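    One way out of the loop, assuming you have (or can create) Windows 8.1 install media: boot from it, choose Repair your computer, then Troubleshoot, Advanced options, Command Prompt, and let the check finish offline instead of during boot. The drive letter may differ inside the recovery environment:

        chkdsk C: /f /r
        REM /r implies /f and also scans for bad sectors, so on a large or
        REM failing disk this can legitimately run for many hours - let it complete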

    Read the article

  • How do I connect a 2008 server to a 2003 server active directory?

    - by Matt
    Our DC is running Windows Server 2003. I've just set up Windows Server 2008 with Terminal Server running on it. When setting the Terminal Server permissions, it was able to resolve a group name read from the domain. In the DC, the new terminal server shows up as a computer in the domain. I can also log in as a domain user even though that user doesn't exist locally on the new server. However, when I go to set sharing permissions on the new machine, it doesn't show my domain as a location; instead it only offers the location "machinename" and doesn't allow the domain to be seen or added. Is there something I'm missing?

    OK, lots of errors in the event log. We have this:

        The winlogon notification subscriber is taking long time to handle the notification event (Logon).

    Followed by this:

        The winlogon notification subscriber took 121 second(s) to handle the notification event (Logon).

    Followed by:

        The processing of Group Policy failed because of lack of network connectivity to a domain controller. This may be a transient condition. A success message would be generated once the machine gets connected to the domain controller and Group Policy has successfully processed. If you do not see a success message for several hours, then contact your administrator.

    I think this might be the same problem I'm having: http://serverfault.com/questions/24420/primary-domain-controller-slow

    Solved. The issue was that I had changed from DHCP to static addressing and put in the wrong DNS server IP - the firewall instead of the DC/DNS server.
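    For anyone hitting the same symptoms, the quickest sanity check is confirming the member server's DNS points at the domain controller and that the domain's service records resolve; the domain name below is a placeholder:

        REM confirm the configured DNS server is the DC, not the firewall
        ipconfig /all
        REM confirm the DC locator SRV records resolve from this machine
        nslookup -type=SRV _ldap._tcp.dc._msdcs.yourdomain.local
        REM re-run Group Policy processing after fixing DNS
        gpupdate /force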

    Read the article

  • Ubuntu root privs installation issue

    - by Pam
    I am a fairly new Ubuntu user (and Linux user, for that matter) and I just downloaded a program whose installer was a .sh file. Not thinking, I copied the installer to an /opt subdirectory, thinking that I was going to install the application there:

        sudo cp ~/Downloads/fooInstaller.sh /opt/someDir

    I can't remember whether I had to use sudo because /opt required it or I just used it without thinking, but in any case I prefixed the command with sudo. Once in /opt/someDir, I executed the installer, again using sudo:

        sudo sh fooInstaller.sh

    The terminal went crazy, and a few seconds later a graphical install wizard popped up that guided me through the rest of the process. At the end of the wizard I was prompted to launch the program, and I did, and everything was great. Until... I closed the program and attempted to add it to my Ubuntu panel (the icon panel at the top of the screen). The program was installed to /usr/local/foo/theProgram, so I specified that path as the command in the custom application launcher. When I open the program through the panel launcher, it doesn't load or operate correctly: I get a lot of error messages complaining about being denied permissions. I'm assuming that this is a superuser/installation/privileges issue, and not a problem with the application (hence this post at superuser.com instead of the application's forums), because when I launch the program from the terminal with sudo, it opens and runs perfectly fine, just like it did the first time around after the install wizard finished.

    I realize I'm probably going to have to uninstall the program completely and re-install it differently. Finally, my question: after uninstalling, can I avoid all these issues by just running the installer (sh fooInstaller.sh) right out of my Downloads directory, without the sudo prefix? If not, how do I get the program to install without root privileges so that I can add it to my panel launcher and have it run correctly? Sorry for the long post, but I didn't want to omit any details because, as I'm sure you can tell, I'm not really sure I know what I'm doing. Thanks for any help!
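    Before reinstalling, it may be enough to hand ownership of whatever the first sudo run created in your home directory back to your user, and make the install directory world-readable. A sketch; the ~/.fooApp path is a guess at where the program stores its per-user settings, not a known location:

        # settings/state directory the program may have created as root on first run
        sudo chown -R "$USER":"$USER" ~/.fooApp
        # let ordinary users read and traverse the install directory
        sudo chmod -R a+rX /usr/local/foo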

    Read the article

  • transparently proxying a firewalled web application from a non-standard port to port 80

    - by Terrence Brannon
    I have a web application that serves on port 8088 on $server. However, the only port accessible from remote on $server is port 80. Furthermore, only CGI programs can execute on port 80. I would like to write a CGI program, accessible via port 80, that allows one to use the web app running on port 8088. From my view, an ideal solution would be some sort of Java web browser that simply opened up a window and allowed me to use the program running on that port; the CGI program would simply initiate a web browser applet or something. I wrote a Perl CGI program that does it, but I really would like a more transparent solution:

        my $q = new CGI;
        print $q->header;
        use LWP::Simple;
        use HTML::Tree;
        my $base = "http://localhost:8088";
        my $request = $base;
        my $qurl = $q->param('url');
        if (length($qurl) > 1) {
            warn "long $qurl";
            $request = "$base$qurl";
        } else {
            warn "short $qurl";
        }
        my $content = get($request);
        my $tree = HTML::TreeBuilder->new_from_content($content);
        my @a = $tree->look_down('_tag' => 'a');
        for my $a (@a) {
            my $url = $a->attr('href');
            next if index($url, '#') > -1;
            $url = "?url=$url";
            $a->attr(href => $url);
        }
        print $tree->as_HTML;
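    For comparison, if the server's Apache configuration could be touched at all (the "CGI only" restriction on port 80 may rule this out), a reverse proxy is the transparent version of the same idea - a minimal sketch, with the /app/ prefix chosen arbitrarily:

        # requires mod_proxy and mod_proxy_http to be loaded
        ProxyPass        /app/ http://localhost:8088/
        ProxyPassReverse /app/ http://localhost:8088/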

    Read the article

  • Seeking web-based FTP client for very large file upload

    - by Paul M. Nguyen
    I have looked around for these for some time... the limits imposed by the web server and/or the dynamic programming environment (e.g. PHP) are far too restrictive for the application I'm working on. We need to be able to move large graphics and video files to and from clients (ranging from tens of MB to a few GB in a single file). Plain FTP with a proper desktop client will do the trick, and we're hosting this in Amazon EC2 with EBS. User management will be done from the office via webmin. Users are chroot-jailed into their home dir by proftpd. net2ftp will work for many clients, but we often need to move single files that approach 1GB or exceed 2-3GB which is way out of the range of any http-based uploader. So we turn to Java or Flash - can they do it? From within the web browser establish an FTP connection and grab a huge file? There are licensed applets and such out there, but none seem convincing. Again, I'm looking for some code that can speak FTP and read (& write?) the local disk, that is delivered in a web browser, and can move single files of 2GB+. The reason for having a web-based interface to FTP is to skip the software installation step for our clients. I will consider proper desktop client software as long as it's "portable" and at least Win+Mac and can be easily configured by lay-man users in a hurry.

    Read the article

  • Port Forwarding(?) TD-W8961nd

    - by rich
    I have a bit of a weird internet setup. I am connected via a decent WiFi connection (from work), which I pick up using a Buffalo AirStation Wireless-G box. This simply picks up the signal and gives me 4 Ethernet ports to connect to. That's all fine and works as it should. I also have a TP-Link TD-W8961ND router which used to be connected to the AirStation via an Ethernet cable so I could essentially have WiFi access in my house. To cut a long story short, I can't remember how the hell I got it to work, and I can't find the notes I scribbled down on how to do it. I'm pretty sure I need to tell the router what IP to pick up the internet connection from and have the local WiFi as a separate network. How I do that, I have no idea right now. Can anyone give me some advice on this? If you need more information, ask and I will provide it. Cheers in advance.

    Edit: I'm at work at the moment so I can't give 100% details, but I will be able to later on.
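    In case it helps jog the memory, the usual way to hang a second router off an existing connection like this is to run it as a plain access point rather than a router. A rough sketch of one common approach; the 192.168.1.x addresses are assumptions about the AirStation's subnet, not values from the question:

        1. Cable a LAN port on the AirStation to a LAN port on the TD-W8961ND (not its ADSL side).
        2. Give the TP-Link a LAN IP in the same subnet but outside the AirStation's DHCP range,
           e.g. 192.168.1.250 if the AirStation is 192.168.1.1.
        3. Disable the DHCP server on the TP-Link so only the AirStation hands out addresses.
        4. Set up the TP-Link's wireless SSID and password as normal.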

    Read the article

  • Outlook 2007 font sizes

    - by Flack
    Hello,
    Something really strange seems to have happened to my Outlook 2007. Everything was working fine for a long time, but at the end of today, all of a sudden, all of the fonts in Outlook are messed up. The font size of mails I write is huge (I am not zoomed in), and the font sizes of the buttons are big too, specifically the "Send", "To", and "Cc" buttons. I tried changing the font sizes through Outlook, but some of the buttons on the "Mail Format" tab in Options are not working, mainly the "Stationery and Fonts" button: I hit it but no window opens. This is all happening on my x64 machine. I took a look at my x32 machine, which also has Outlook 2007 installed, and everything is fine there. Below is a link to an image comparing the broken, large-font Outlook (top of picture) and the normal, working Outlook. The text in the mails I compose is also abnormally large in the broken Outlook. Big font Outlook buttons. Any ideas? This came out of nowhere after a few months of no problems. Thanks.

    Read the article

  • 100% iowait + drive faults in dmesg

    - by w00t
    Hi, I have a server that hosts a fairly heavily visited web app. It has a RAID1 of 2 HDDs (64 MB buffer, 7200 RPM). Today it started throwing out errors like:

        kernel: ata2.00: exception Emask 0x0 SAct 0x0 SErr 0x0 action 0x6 frozen
        kernel: ata2.00: cmd b0/d0:01:00:4f:c2/00:00:00:00:00/00 tag 0 pio 512 in
        kernel: res 40/00:00:00:4f:c2/00:00:00:00:00/00 Emask 0x4 (timeout)
        kernel: ata2.00: status: { DRDY }
        kernel: ata2: hard resetting link
        kernel: ata2: SATA link up 3.0 Gbps (SStatus 123 SControl 300)
        kernel: ata2.00: max_sectors limited to 256 for NCQ
        kernel: ata2.00: max_sectors limited to 256 for NCQ
        kernel: ata2.00: configured for UDMA/133
        kernel: sd 1:0:0:0: timing out command, waited 7s
        kernel: ata2: EH complete
        kernel: SCSI device sda: 976773168 512-byte hdwr sectors (500108 MB)
        kernel: sda: Write Protect is off
        kernel: SCSI device sda: drive cache: write back

    All day long the load has been higher than 10-15. I am monitoring it with atop and it gives some bizarre readings:

        DSK | sda | busy 100% | read  2 | write 208 | KiB/r 16 | KiB/w 32 | MBr/s 0.00 | MBw/s 0.65 | avq 86.17 | avio 47.6 ms |
        DSK | sdb | busy   1% | read 10 | write 117 | KiB/r 17 | KiB/w  5 | MBr/s 0.02 | MBw/s 0.07 | avq  4.86 | avio 1.04 ms |

    I frankly don't understand why only sda is taking all the hit. I do have one process that is constantly writing 1-2 MB, but what the hell... 100% iowait?
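    The ata2 timeouts and link resets point at the disk (or its cabling) rather than the workload, so checking the drive's SMART health is a reasonable next step - a sketch, assuming smartmontools is installed and sda is the suspect disk:

        smartctl -a /dev/sda          # full SMART attribute and error-log report
        smartctl -t short /dev/sda    # kick off a short self-test, then re-check with -a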

    Read the article

  • Very strange networking problem with all computers in my house

    - by Anthony
    I have three computers in my house: One desktop (wired), and two laptops (wireless). I'm using Cox Communications (yes they suck), and yesterday they had a major outage. I know it was them because I called them up when I started losing connection to the internet. All the computers can connect just fine, but they don't have internet access. It just says "local only". The weird thing is, some of them work occasionally. For the first day my laptop was working perfectly, while all the other computers couldn't connect. Later on in the day it got reversed, and the desktop was the one with internet access. By the second day the problem on Cox's end was fixed, but we still had no access. I called them up and they reset my modem, and did the usual troubleshooting stuff. It never fixed the problem, but we found out that the problem had to do with conflicting IP addresses. My router was a Linksys WRT54G and it was about 5 years old. I figured it might have gotten damaged from the outage since it was so old, and now it's having trouble "fixing itself" and giving out the proper IP addresses. So I bought a new router, a Cisco Linksys E1000. I set everything up, and still the same problem. My computer has access right now (that's how I'm writing this), but no other computers seem to be able to get access. Is there possible damage to the modem? Can someone help me please? Sorry for this being so long.
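    On each affected Windows machine, releasing the conflicting lease and clearing cached state is a quick, low-risk test once the router has been swapped - standard commands, run from an elevated command prompt:

        ipconfig /release
        ipconfig /flushdns
        ipconfig /renew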

    Read the article

  • Refresh file access time under Linux / Discard disk read cache

    - by calandoa
    I am making use of the access time to analyse a build process, but it is not working the way I want: the access time is updated the first time I read the file, then it stays the same for a long while, or until the next reboot. For instance:

        $ ll -u some_file
        -rw-r--r-- 1 root root 1.3M 2010-04-07 10:03 some_file
        $ grep abcdef some_file
        $ ll -u some_file
        -rw-r--r-- 1 root root 1.3M 2010-04-07 11:24 some_file
        # The access time is updated

        # waiting a few minutes...

        $ grep abcdef some_file
        $ ll -u some_file
        -rw-r--r-- 1 root root 1.3M 2010-04-07 11:24 some_file
        # The access time has not been updated :(

    I suppose that the file is buffered by Linux in free memory, and only this cached copy is accessed on subsequent reads, for speed reasons. A solution would be to discard the buffers in memory. After searching some forums, I found:

        sync
        echo 1 > /proc/sys/vm/drop_caches
        echo 2 > /proc/sys/vm/drop_caches
        echo 3 > /proc/sys/vm/drop_caches

    But it is not working; it seems to only sync up the write buffers, not the read ones. Maybe it is due to some custom kernel configuration on my distro (Fedora 9)? Or am I missing something here? Is there a way to achieve this access-time refresh? Note also that I do not want to simulate writes on my entire file tree: because I am using a makefile-based build system, this would cause the entire project to be built again.
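    It is also worth checking how the filesystem is mounted, since relatime or noatime mount options suppress most atime updates regardless of caching. A sketch, assuming the files live on the root filesystem and the kernel is new enough to offer strictatime (2.6.30+, so possibly not on Fedora 9):

        mount | grep ' on / '              # look for relatime/noatime in the options
        mount -o remount,strictatime /     # force classic atime semantics, if supported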

    Read the article

  • What Wireless Router/ADSL Modem to get? N-band a must!!

    - by JJarava
    I'm looking for a dual-band N router or ADSL gateway and I'd like some recommendations.

    Situation: I have an 802.11b/g ADSL gateway provided by my telco, but the WiFi signal won't cover the whole house (especially the living room, so my TV-connected Mac Mini has poor to no internet access). So I'm looking either to replace the DSL modem with an N-enabled one, or to add a router to the mix. I've had a modem+router setup for many years, and I know the advantages (double NAT, double firewall = more security) and issues (more complex to troubleshoot, two possible points of failure), so I'd rather live with a single device (an ADSL gateway), if possible.

    Requirements: dual-band N (300 Mbps WiFi), Gigabit Ethernet ports, ADSL2+ support (if it's an ADSL gateway, which would be desirable), and the best range and speed possible.

    Nice to have: USB port to share disks/printers on the network, media streaming.

    I've been a long-time user of Linksys, so googling around I found the WRT610N (http://www.linksysbycisco.com/US/en/products/WRT610N) from a "pure router" perspective; it's one of those that Linksys styles "N++" (http://www.linksysbycisco.com/US/en/promo/Promotion-Go-Wireless?stepname=Promotion-Step-Go-Wireless-High-Performance). But I haven't been able to find similar ADSL gateways. I've found the WAG320N, but there is little to no info on the Linksys site (i.e., I don't know if it's dual-band or if it has Gigabit Ethernet). Any opinions/recommendations of other products/suggestions are more than welcome.

    Read the article

  • Should Windows services be created with custom users, or should I use one of LocalSystem/LocalService/NetworkService?

    - by Justin Dearing
    I'm asking the question in general for the average custom-developed NT service or Unix OSS daemon ported to Windows with SCM support. However, at the moment my immediate concern is MongoDB. From my experience with Unix, I like all my services to run as different unprivileged users. The way this has translated to Windows is as follows:

    1. Create a local (or domain, if it has to talk to SQL Server) Windows user with a long random password (lately an ASCII85-encoded GUID generated on a different machine).
    2. Set its password to never expire and forbid it from changing its password.
    3. Remove that user from the "Users" group.
    4. Grant that user the "Log on as a service" right.
    5. Give it read permission on the folder where the app resides, and write permission on the logs and data files the application uses.
    6. Assign the user to the service.
    7. Troubleshoot until the service starts.

    My feeling is that these unprivileged users are less powerful than the 3 special service accounts. I also feel that by isolating which users run which services, I limit the collateral damage if a way to compromise one service is found.
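    Most of that procedure can be scripted; a rough sketch with made-up account, password and service names (note the space after obj= and password= is required by sc):

        net user svc-mongod "LongRandomPasswordHere" /add /expires:never /passwordchg:no
        net localgroup Users svc-mongod /delete
        sc config MongoDB obj= ".\svc-mongod" password= "LongRandomPasswordHere"

    Granting the "Log on as a service" right still has to be done in Local Security Policy (or with ntrights from the old resource kit).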

    Read the article

  • mod_disk_cache: permanently caching images and disabling recurring header updates

    - by user135532
    I am trying to get mod_disk_cache to permanently cache images retrieved from an image server, which the webserver fetches using ProxyPass. The image is retrieved correctly from the image server and is served from the cache on further requests, but the webserver still contacts the image server and the cached headers get updated. Because of load concerns I need to never call the image server again for a specific URL after it has been cached once, or to extend the refresh time for as long as possible. The webserver is IHS 7.0; the modules are mod_disk_cache.so, mod_cache.so and mod_proxy.so, version 2.2.8.0. The following is from my httpd.conf:

        ProxyPass /webserver/media/images/ http://imageserver.com/ws/media/images/

        # Caching pictures
        <IfModule mod_cache.c>
          <IfModule mod_disk_cache.c>
            CacheDefaultExpire 2628000
            #CacheDisable
            CacheEnable disk /webserver/media/images/
            CacheIgnoreCacheControl On
            CacheIgnoreHeaders Cookie Referer User-Agent X-Forwarded-For X-Forwarded-Host X-Forwarded-Server Accept-Language Accept Host
            CacheIgnoreNoLastMod On
            CacheIgnoreQueryString Off
            #CacheIgnoreURLSessionIdentifiers
            CacheLastModifiedFactor 10000000.1
            #CacheLock on
            #CacheLockMaxAge 5
            #CacheLockPath
            CacheMaxExpire 1576800
            CacheStoreNoStore On
            CacheStorePrivate On
            CacheDirLength 2
            CacheDirLevels 3
            CacheMaxFileSize 1000000
            CacheMinFileSize 1
            CacheRoot c:/cacheroot2
          </IfModule>
        </IfModule>
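    One thing worth noting about the values above: CacheMaxExpire caps how long an entry may stay fresh regardless of CacheDefaultExpire, and here it is set lower (1576800 s, about 18 days) than CacheDefaultExpire (2628000 s), so the smaller value wins. Raising the cap pushes revalidation out further, though mod_cache will still revalidate once an entry finally goes stale - it cannot be made truly permanent. A sketch, with one year chosen arbitrarily:

        CacheDefaultExpire 31536000
        CacheMaxExpire     31536000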

    Read the article

  • Slow File Copy observed copying 40GB files across network to iSCSI device

    - by Rick
    Here's a curious one for the gurus.

    Setup:
    Source machine: Windows Server 2003 R2 with a local hard drive holding a 40 GB VHD file. 1 x 1 Gbps network card, Cat6 cable, switch.
    Target machine: Windows Server 2008 R2 with an iSCSI connection to an iSCSI target on a separate machine (1 TB, RAID5). 1 x 1 Gbps network card, Cat6 cable, connected to the same switch as the source machine. A second 1 Gbps network card, Cat6 cable, connected via an isolated switch to the iSCSI target.
    The switches are Netgear JGS524 models (web managed).

    If I copy from the Win2003R2 machine to the Win2008R2 machine's local drive, I get 40 GB in 45 minutes 36 seconds.
    If I copy from the Win2008R2 machine's local drive to the iSCSI target, I get 40 GB in 37 minutes 56 seconds.
    If I copy from the Win2003R2 machine to the iSCSI target via the Win2008R2 machine, I get 40 GB in 3 hours 50 minutes 24 seconds.

    All copies were done via the following command issued on the Win2008R2 box:

        XCOPY <source> <target> /J

    (/J copies using unbuffered I/O; recommended for very large files.)

    So, what's the bit I'm missing here? Why does a back-to-back copy take 1 hour 23 minutes 32 seconds in total, while the "straight through" copy takes almost 3 times as long? The switches show no errors, and network utilisation hovers around the 3% mark for the duration of the straight-through copy (whereas the back-to-back copies sit around the 25% utilisation mark). What have I missed?

    Read the article

  • Acer: recovering Windows Vista

    - by Charlie Pigarelli
    My computer's history is very long, even though it is only 4 years old. A year ago I installed Windows 7 on this Acer M1610, which had Vista before. My technician left me 2 recovery discs for "Acer Vista" before updating it to Windows 7. Then the computer had some trouble: the graphics card broke, and we decided to use another computer. Yesterday I had the great idea of merging the two computers into a better one, so I moved the graphics card from the second computer into the Acer, and everything went well. Then the speed problems it had before came back, so I decided to reinstall the very first Windows: Vista again. I booted the computer with those 2 DVD-Rs my technician left me, and at the end of the process it asked me to insert "the backup CD number one or the system disk". I found 2 original Acer "Blank Recovery Disc" DVD-Rs and tried those: rejected. I tried empty DVD+/-Rs: rejected. I tried CDs: rejected. I don't have any system disc with me, except for those 2 DVD-Rs my technician left me. What am I supposed to do now? I even tried the fabled Alt+F9/F10 that should start the recovery without any disc, but nothing happened.

    PS: the installation cannot complete if I do not insert the right disc. (The recovery disc uses Acer eRecovery Management as the recovery software.)

    Read the article

  • APACHE - PHP - Bounce emails

    - by user1179459
    I want to improve our mailing lists by handling all the bounces we get on our websites. I have one website with over 8,000 users and another with over 1,500 users; they are emailed various notifications every second, e.g. job alerts and email alerts. I am using POP connections with Exim on an Apache server, and most scripts are PHP and generate email on the fly.

    Problems I have:
    - Some users registered a long time ago, and quite a few of their addresses now bounce.
    - Some users register with dummy addresses like [email protected] that never existed but look valid. Is there any chance of stopping this without asking them to log in to the email account and click confirmation links that often don't work (too annoying for the end user)?
    - The server is sending unnecessary emails that could be avoided if I knew the addresses don't exist.

    Solutions I need:
    - Is there a way to download the bounce list somewhere (WHM/cPanel)? I know Exim has it, but it's not readable (I need a file like CSV or something similar that I can scan over and then write a PHP script against to delete the users from the database).
    - Is there any function in PHP that can check the existence of an email address on the fly, so that the mailer class can check before it sends out?
    - Will bouncing emails eat up a lot of server resources (memory/CPU) while being processed, or is it minimal enough that we don't have to worry about it at all?
    - Maybe an open-source or Linux tool to capture them, view them as reports, and clean them up?

    I am not a Linux expert or server admin, but I do a lot of PHP coding, so please be descriptive with the solutions, especially if they are Linux commands. Thank you!
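    Exim's main log is the usual starting point for a bounce list: failed deliveries are logged with a "**" marker followed by the recipient address. A rough sketch, assuming the common cPanel log location; the field position can shift if extra log selectors are enabled:

        grep ' \*\* ' /var/log/exim_mainlog | awk '{print $5}' | sort -u > bounced_addresses.txt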

    Read the article

  • Will restoring a cPanel account delete unrelated MySQL databases?

    - by user54625
    Long story short: I have many accounts in WHM and many domains. However, most of the MySQL databases for all the sites were created under one user, "admin" (let's say it's admin.com). I recently moved the admin.com account to another web server, and now I need to move it back. I would like to restore this account from a backup made by that web server. However, the backup only has the 3 databases attached to that domain, while the server I'm moving it to has those 3 databases (old DBs that need to be replaced) PLUS all the other databases, which are used by other accounts on the server.

    My question: if I restore the admin.com account, will it delete all the other databases owned by that account, or will it only replace the 3 DBs in the backup? I can't take any chances on deleting all those other DBs... If it WOULD delete them, how would I go about restoring this account without losing the other data?

    An easy way to resolve this situation would be to move the MySQL databases to their proper owning accounts, but alas, there doesn't seem to be a way to do that via WHM.
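    For the ownership-reassignment route, recent cPanel versions ship a command-line mapping tool that can hand existing databases to another account without touching their contents; a sketch with made-up account and database names, assuming your cPanel version includes dbmaptool:

        /usr/local/cpanel/bin/dbmaptool otheraccount --type mysql --dbs 'shop_db, blog_db'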

    Read the article

  • Software mirroring (RAID1) versus "Fake Raid" for new Windows 7 install

    - by kquinn
    I've just ordered two new hard drives for my main desktop and a copy of Windows 7 Professional 64-bit. I'd like to do a clean install of Win7 onto the new drives (leaving my old XP Pro boot partition around for a while in case something goes disastrously wrong, etc.). I want to have them set up in mirrored (RAID-1) mode. My understanding is that Win7 Pro can do software mirroring, but can I set this up directly at install time? If so, how? Note that I'd like the disk to be split into three partitions (OS/Apps&Data/Bulk data), all of which should be mirrored. Would it be better (more reliable or faster) to use my motherboard's hardware RAID support? My motherboard is an older nVidia nForce 680i SLI, which is not the most stable of motherboards, and I'm not sure how trustworthy its RAID1 configuration might be (or if Win7 could even detect and install onto a hardware-mirrored volume). Also, the performance characteristics of RAID1 are rather different than RAID0 or RAID5, and I'm wondering if Win7's software mirroring might actually be faster than hardware RAID1 (for example, I'm more of a Unix admin when I have to wear the sysadmin hat, and I've had great success deploying ZFS; most hardware RAID1 implementations have to read both disks and compare results to look for data errors, but ZFS can read from only one disk in the mirror and just use the built-in checksum, meaning it can have up to 2x the number of reads in-flight, as long as there's no data corruption). Edit: Okay, my question about whether Windows 7 can do software mirroring has been answered, and it can. I'm still unsure whether Windows software RAID or my motherboard's hardware "fake RAID" function is a better choice, though. Remember, I'm only interested in mirroring -- not the more complicated striping or parity operations that generally show the poor performance of crappy motherboard RAID solutions.
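    Windows 7 setup itself won't create the mirror for you; the usual approach is to install onto one disk, partition it as planned, and then add the mirror afterwards in Disk Management ("Add Mirror...") or diskpart. A sketch of the diskpart route, assuming the Windows disk is 0, the second drive is 1, and C is the volume being mirrored - note that converting to dynamic disks is effectively one-way:

        diskpart
        select disk 0
        convert dynamic
        select disk 1
        convert dynamic
        select volume C
        add disk=1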

    Read the article

  • How does the "Steam" platform work? Is it DRM? Can I trust "Steam"-powered software? [closed]

    - by Chris W. Rea
    So – I just bought the new game Supreme Commander 2. This question is not about the game, but about the online software installation platform that it seems to require. I haven't bought a game in a long time, and I'm puzzled: Apparently, SC2 is a "Steam"-powered game. When I went to install the game, it asked me to either create a new Steam account, or log in with an existing account. I clicked "Cancel" because I don't plan to play online and I don't want anything unnecessary installed on my computer, since I only plan to play single player! However, after clicking "Cancel", the installer asked for my confirmation that I indeed wanted to cancel installation of the game! I thought I was just canceling the "online" portions! So I really want to know: How do "Steam" powered games work? Is this essentially a form of DRM (Digital Rights Management)? Can I trust this software platform? Has anybody done any independent verification on how this platform works? (I'm very leery of any DRM after the Sony BMG CD copy protection scandal. Thank goodness for Mark Russinovich.) Does the "Steam" platform install anything particularly nasty or unwanted on my computer? High-rep users: Please vote to reopen this question. It is not about the game, but about the software update platform / updater / DRM. Imagine if the software in question were a productivity application. The issues remain the same.

    Read the article
