Search Results

Search found 54055 results on 2163 pages for 'multiple files'.


  • rsync error: some files/attrs were not transferred

    - by Daniel Ball
    Using rsync (Ubuntu) and a DeltaCopy server on W2K3 to back up some of the data on the file server before I migrate from W2K3 to Ubuntu Server. After it completed, I ran a dry run just in case something had been missed or changed, and got the following:

        sudo rsync -az -n 198.3.9.25::Music /mnt/raid/music
        [sudo] password for daniel:
        file has vanished: "?????\#267????" (in Music)
        file has vanished: "????????" (in Music)
        ...
        rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1526) [generator=3.0.7]

    I just want to make sure I'm reading it right: somehow there are files on the receiving end that aren't on the sending end?

  • Hyper-V multiple virtual machines with >2TB volumes from one RAID

    - by wurlog
    I have a server with two RAID arrays:

        RAID 0: 2x 1TB
        RAID 6: 8x 2TB

    The first array I used for the Hyper-V installation itself. The virtual machines should use the RAID 6, but how do I configure that? I need at least one file server with most of the disk space (maybe a second one). But every VHD has a 2 TB maximum, and I can't use the volume directly because other virtual machines also have to have access to the RAID 6. What do I do?

  • Win 7 accessing large files uses 100% RAM

    - by user181276
    Running Win 7 64-bit SP1 with 8 GB RAM. I first noticed this problem when using the GUI to copy some large (5+ GB) files from one disk to another: the physical memory in use rises quite quickly to 100% and the system comes to a crawl. If I just start to access the file in a media player (it is a movie), the memory usage climbs slowly but eventually reaches 100%. When copying the same files via XCOPY I do not have this problem. Using RAMMap I see most of the memory usage is under "Mapped File" and is allocated under the "Active" column. If I select "Empty System Working Set", the RAM usage drops back down but then starts to climb back up. Any ideas on what I can check/test to eliminate this issue?

  • SQL Server 2005 standard filegroups / files for performance on SAN

    - by Blootac
    OK, so I've just been on a SQL Server course where we discussed the usage scenarios of multiple filegroups and files over local RAID and local disks, but we didn't touch SAN scenarios, so my question is as follows.

    I currently have a 250 GB database running on SQL Server 2005 where some tables have a huge number of writes and others are fairly static. The database and all objects reside in a single filegroup with a single data file. The log file is on the same volume. My understanding is that separate data files should be spread across different disks to lessen disk contention, and that filegroups should be used for partitioning data. However, with a SAN you obviously don't really have the same disk-contention issue that you do with a small RAID setup (or at least we don't at the moment), and Standard Edition doesn't support partitioning.

    So, in order to improve parallelism, what should I do? My understanding of various Microsoft publications is that if I increase the number of data files, separate threads can act on each file in parallel. Which leads to the question: how many files should I have? One per core? Should I be putting tables and indexes with high levels of activity in separate filegroups, each with the same number of data files as we have cores? Thank you

  • Multiple gcc on Mac OS X

    - by snihalani
    I did a port install of GCC version 4.7.1 (MacPorts gcc47 4.7.1_2). I named the executable g+ and placed it in one of the directories on my $PATH. I use GCC 4.7.1 when I need the C++11 standard. I haven't changed the original g++, so as not to mess up Xcode. I am using eclipse-cdt and running make all from its window. It's giving me:

        20:12:40 **** Build of configuration Default for project 2804-hw2 ****
        make all
        g+ -c -Wall -std=c++11 main.cpp -o main.o
        make: g+: No such file or directory
        make: *** [main.o] Error 1
        20:12:40 Build Finished (took 89ms)

    Here is my makefile:

        CC=g+
        CFLAGS=-c -Wall -std=c++11
        LDFLAGS=
        SOURCES=main.cpp Vector3D.cpp
        OBJECTS=$(SOURCES:.cpp=.o)
        EXECUTABLE=exec

        all: $(SOURCES) $(EXECUTABLE)

        $(EXECUTABLE): $(OBJECTS)
        	$(CC) $(LDFLAGS) $(OBJECTS) -o $@

        .cpp.o:
        	$(CC) $(CFLAGS) $< -o $@

        clean:
        	rm $(EXECUTABLE) $(OBJECTS)

    How do I make Eclipse detect my g+?

  • How to get average of multiple time series

    - by Supun Kamburugamuva
    I have a few computers running, and I want to get their average CPU usage and plot it as a graph. So I've collected CPU usage at regular intervals on these machines, and for each computer I have a data set of (time, CPU usage) pairs. But the times at which the CPU measurements are taken on different machines are not in sync. For example, on the 1st machine the CPU may be measured at times 1, 5, 9; on the second machine at times 2, 5, 8. I want to compute an average data series from these different data sets. Could you point me to some resources? Thanks - Supun.
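
    A common approach here is to resample each machine's series onto a shared time grid by linear interpolation and then average the interpolated values pointwise. Below is a minimal Java sketch of that idea; the TreeMap-of-(time, usage) layout, the class name, and the grid parameters are illustrative assumptions, not details from the question.

        import java.util.List;
        import java.util.Map;
        import java.util.TreeMap;

        public class SeriesAverager {

            // Linearly interpolate one series at time t, clamping at the ends.
            // Each series is a sorted map of sample time -> CPU usage,
            // assumed non-empty.
            static double interpolate(TreeMap<Double, Double> series, double t) {
                Map.Entry<Double, Double> lo = series.floorEntry(t);
                Map.Entry<Double, Double> hi = series.ceilingEntry(t);
                if (lo == null) return hi.getValue(); // before the first sample
                if (hi == null) return lo.getValue(); // after the last sample
                if (lo.getKey().equals(hi.getKey())) return lo.getValue();
                double f = (t - lo.getKey()) / (hi.getKey() - lo.getKey());
                return lo.getValue() + f * (hi.getValue() - lo.getValue());
            }

            // Average all series on a shared grid of `steps` evenly spaced
            // points between start and end (assumes steps >= 2).
            static double[] average(List<TreeMap<Double, Double>> allSeries,
                                    double start, double end, int steps) {
                double[] result = new double[steps];
                for (int i = 0; i < steps; i++) {
                    double t = start + i * (end - start) / (steps - 1);
                    double sum = 0.0;
                    for (TreeMap<Double, Double> s : allSeries) {
                        sum += interpolate(s, t);
                    }
                    result[i] = sum / allSeries.size();
                }
                return result;
            }
        }

    Clamping at the endpoints keeps the average defined across the whole window even where one machine has no nearby sample; dropping out-of-range series from the average at those points would be an equally reasonable choice.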

  • DHCP server with multiple interfaces on Ubuntu destroys default gateway

    - by Henrik Kjus Alstad
    I use Ubuntu, and I have many interfaces: eth0, which is my internet connection and gets its info from a DHCP server totally outside of my control, and eth1-eth4, for which I have created a DHCP server (ISC DHCP server). It seems to work, and I even get an IP address from the foreign DHCP server on the internet-facing interface. However, for some reason my gateway for eth0 got broken after I installed my local DHCP server for eth1-eth4. (I think so because I get an IP for eth0 and I can ping other stuff on the local network, but I cannot access the internet.)

    My eth0-specific info in /etc/network/interfaces:

        auto lo
        iface lo inet loopback

        auto eth0
        iface eth0 inet dhcp

        auto eth1
        iface eth1 inet static
            address 10.0.1.1
            netmask 255.255.255.0
            network 10.0.1.0
            broadcast 10.0.1.255
            gateway 10.0.1.1
            mtu 8192

        auto eth2
        iface eth2 inet static
            address 10.0.2.1
            netmask 255.255.255.0
            network 10.0.2.0
            broadcast 10.0.2.255
            gateway 10.0.2.1
            mtu 8192

    My /etc/default/isc-dhcp-server:

        INTERFACES="eth1 eth2 eth3 eth4"

    So why does my local DHCP server break the gateway for eth0 when I tell it not to listen on eth0? Does anyone see the problem, or what I can do to fix it? The problem does indeed seem to be the gateways. "netstat -nr" gives:

        0.0.0.0    10.X.X.X       0.0.0.0    UG    0 0 0 eth3

    It should have been:

        0.0.0.0    129.2XX.X.X    0.0.0.0    UG    0 0 0 eth0

    So for some reason, my local DHCP server overrides the gateway I get from the network's DHCP.

    Edit: dhcpd.conf looks like this (I included info only for the eth1 subnet):

        ddns-update-style none;
        not authoritative;

        subnet 10.0.1.0 netmask 255.255.255.0 {
            interface eth1;
            option domain-name "example.org";
            option domain-name-servers ns1.example.org, ns2.example.org;
            default-lease-time 600;
            max-lease-time 7200;
            range 10.0.1.10 10.0.1.100;
            host camera1_1 {
                hardware ethernet 00:30:53:11:24:6E;
                fixed-address 10.0.1.10;
            }
            host camera2_1 {
                hardware ethernet 00:30:53:10:16:70;
                fixed-address 10.0.1.11;
            }
        }

    Also, it seems the gateway is set correctly if I run "/etc/init.d/networking restart" in a terminal, but that's not helpful for me: I need the correct gateway to be set during startup, and I'd rather find the source of the problem.

  • Install compiled Linux program on multiple computers

    - by jtnire
    I'm sorry if this sounds like a silly question, but when I compile something on Linux using the usual "./configure, make, make install" steps, how can I install the program on other servers without having to recompile? I am trying to avoid installing the build tools on production servers; however, I need the latest version of a particular piece of software, so using RPMs isn't an option in this case. Any help is appreciated. Thanks

  • Xcode custom project template creates only references to files

    - by user315374
    Hi, I tried to create a custom project template for setting up unit testing. The problem is that when I create a new project based on this template, it creates references to the template files: when I edit a file, it changes my template files instead of my actual project files, and when I delete my template, files in my actual project turn red. The project template is in:

        /Developer/Platforms/iPhoneOS.platform/Developer/Library/Xcode/Project Templates/Application/Test-based Application

    Reading some questions on Stack Overflow, I tried to install my template project in:

        /Library/Application Support/Developer/Shared/Xcode/Project Templates

    But I cannot see my template when I create a new project. Can anybody help? Thanks, Vincent

  • Web hosting for multiple web sites providing system isolation

    - by Justin
    We have a small number of projects where we expect the client will not be maintaining the installed versions of the applications we install to power the site (such as Drupal). Given that an important part of security is keeping things updated, we don't want to host these projects on our Plesk-powered dedicated servers that currently host lots of our other clients' websites. Our goal is to find a host where we can deploy isolated instances (be these slices, virtual servers, grid servers, etc.) for each individual web site (or group of 2-3 sites) as we need them. These instances would be completely separate, so that if one web site were hacked it would not impact any other site.

    Typical hosting requirements:

    - Linux, Apache, PHP 5, MySQL
    - Supports Drupal
    - Ability to set up a cron task (but we don't need SSH access)
    - Daily backups
    - Virtualized/cloud hosting (we want to avoid shared)
    - Pricing per site around $25/month
    - OS is patched automatically

    Some options we have considered but that won't work:

    - MediaTemple: Two major data-center-wide security incidents and recent downtime foster doubt about this host's technical ability.
    - Slicehost: This would require us to manage the entire server, which we don't want to do.
    - Rackspace Cloud Sites (formerly Mosso): No backup options.

    Do you have any recommended hosting options given these requirements?

  • Multiple nt52 entries in bootmgr

    - by SLaks
    I have a machine with Windows XP, Server 2003 R2, and Server 2008 R2. Right now, bootmgr has one entry for Server 2008 R2 and one entry for ntldr, which then leads to the ntldr boot.ini menu. Is it possible to add two different nt52 entries on two partitions so that I can access all three OSes from the bootmgr menu? Right now, Server 2008 and XP are in logical drives on an extended partition, but (I assume) I can image them onto basic partitions if necessary.

  • Basic video editor for AVCHD Lite files?

    - by davr
    I have a camera (Panasonic Lumix GF-1) that outputs "AVCHD Lite" files: 720p H.264 in an MTS container. I saw this question that said Movie Maker in Windows 7 supports AVCHD... but I just tried, and unfortunately it does not support AVCHD Lite. Are there any free or inexpensive non-linear video editors (NLEs) that can natively handle AVCHD Lite files, without requiring some 3rd-party driver? If not, are there any 3rd-party drivers that are especially stable? (In my experience they usually have some problems: I got AVCHD Lite loading into VirtualDub using a 3rd-party plugin, but it's very slow, sometimes crashes, and seeking takes ages.)

  • Only one domain not resolving via Windows DNS server at multiple locations, but resolving at others

    - by Brett G
    I'm having quite a weird issue. We had mail delivery issues to a specific domain. After looking closer, I realized that DNS for that domain isn't resolving via the in-house Windows 2003 SP2 DNS server:

        C:\>nslookup foodmix.net
        Server:  DC.DOMAIN.com
        Address:  10.1.1.1

        DNS request timed out.
            timeout was 2 seconds.
        DNS request timed out.
            timeout was 2 seconds.
        *** Request to DC.DOMAIN.com timed-out

    (DC.DOMAIN.com and 10.1.1.1 are generic values replacing the actual ones.) Even if I run this nslookup from the DC.DOMAIN.com server itself, I get the same result. However, all other requests work as they should. I tried it on servers at completely separate organizations on different networks (Windows 2003 AD servers), and the weird thing is that some of these had the same exact issue. Public DNS servers, however, work. I have tried clearing the DNS cache, restarting the server, restarting the services, etc. Nothing has worked.

    One possibly related event I noticed in the DNS Server event log is event ID 5504 with the following description:

        The DNS server encountered an invalid domain name in a packet from 192.33.4.12. The packet will be rejected. The event data contains the DNS packet. For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

    In the data section below it, I can see the following mentioned:

        ns2.webhostingstar.com

    which happens to be the nameserver for the domain in question. Several discussion threads and an MS KB article pointed to disabling EDNS. I have done this via "dnscmd /config /enableednsprobes 0", but it has not fixed the issue.

  • Using ServletOutputStream to write very large files in a Java servlet without memory issues

    - by Martin
    I am using IBM WebSphere Application Server v6 and Java 1.4 and am trying to write large CSV files to the ServletOutputStream for a user to download. Files range from 50 to 750 MB at the moment. The smaller files aren't causing too much of a problem, but with the larger files it appears that the response is being written into the heap, which then causes an OutOfMemory error and brings down the entire server. These files can only be served to authenticated users over HTTPS, which is why I am serving them through a servlet instead of just sticking them in Apache. The code I am using is (some fluff removed around this):

        resp.setHeader("Content-length", "" + fileLength);
        resp.setContentType("application/vnd.ms-excel");
        resp.setHeader("Content-Disposition", "attachment; filename=\"export.csv\"");

        FileInputStream inputStream = null;
        try {
            inputStream = new FileInputStream(path);
            byte[] buffer = new byte[1024];
            int bytesRead = 0;
            do {
                bytesRead = inputStream.read(buffer, offset, buffer.length);
                resp.getOutputStream().write(buffer, 0, bytesRead);
            } while (bytesRead == buffer.length);
            resp.getOutputStream().flush();
        } finally {
            if (inputStream != null)
                inputStream.close();
        }

    The FileInputStream doesn't seem to be causing the problem: if I write to another file or just remove the write completely, the memory usage isn't a problem. What I am thinking is that resp.getOutputStream().write is being stored in memory until the data can be sent through to the client. So the entire file might be read and stored in resp.getOutputStream(), causing my memory issues and the crash! I have tried buffering these streams and also tried using channels from java.nio, neither of which made any difference to my memory issues. I have also flushed the output stream once per iteration of the loop and after the loop, which didn't help.
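
    As an aside, the posted loop has a bug independent of the memory issue: read() may legally return fewer bytes than a full buffer before EOF (ending the do/while early and truncating the download), it returns -1 at EOF (so the final write(buffer, 0, -1) would throw), and offset should be 0 when the buffer is reused from the start. Here is a minimal sketch of a loop that avoids those problems, with a per-chunk flush to discourage the container from accumulating the whole response; the buffer size and variable names are illustrative, and this is not a verified WebSphere-specific fix:

        ServletOutputStream out = resp.getOutputStream();
        FileInputStream in = null;
        try {
            in = new FileInputStream(path);
            byte[] buffer = new byte[8192];
            int bytesRead;
            // read() returns -1 at EOF; a short read mid-stream is still written
            while ((bytesRead = in.read(buffer)) != -1) {
                out.write(buffer, 0, bytesRead);
                out.flush(); // push each chunk to the client instead of holding it
            }
        } finally {
            if (in != null)
                in.close();
        }

    Setting Content-Length up front (as the original code does) normally lets a container stream the body rather than buffer it to compute the length itself; resp.setBufferSize(...), which controls the servlet response buffer, is also worth checking before writing begins.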

  • LAMP/TURNKEY LINUX/VIRTUAL BOX: Manipulating Files on a Virtual Machine

    - by aeid
    I am running Ubuntu 9.10 and I want to install Turnkey Linux's LAMP server on my machine to test out my code. I installed the Turnkey LAMP appliance via VirtualBox, and it seems to be working because I can access http://localhost. My question is: how do I manipulate files on the virtual machine? For example, if I had installed LAMP on my machine directly (not on a virtual machine), I could easily add/edit/delete files in the /var/www folder. Where is the equivalent of the www folder on the VirtualBox guest, and how can I interface with it? Thanks,

  • Plesk 10 Postfix with multiple IP addresses and SSL certificates

    - by JulianB
    We are currently running a root server with Debian 6 and Plesk 10.4.4. We have some virtual hosts using one IP address (shared), e.g. example1.com, and another virtual host using a dedicated IP address (example2.com). Is there a way to configure Postfix to do the following?

    1. Always use the IP address of the virtual host to which the e-mail account belongs (so that an e-mail from [email protected] will originate from the shared IP address and an e-mail from [email protected] will originate from the dedicated IP).
    2. Use different certificates for TLS for example1.com and example2.com.

    If the latter is not possible: could any problems arise when using the example1.com certificate for example2.com users? Of course, example2.com users would have to configure their clients to use example1.com as the SMTP server name to avoid annoying security warnings. But if we could still get the effect of the first point, that would be acceptable.

  • Privoxy-like proxy that handles multiple parallel connections?

    - by overtherainbow
    Hello. I use Privoxy on my XP host to filter/rewrite web pages, but it's slower because all connections go through Privoxy's single port. According to this post on Stack Overflow, browsers support more than one simultaneous connection by default, which would explain why going through Privoxy is slower. Does someone know of a similar application that can handle more than one connection? Thank you.

  • Multiple Haskell cabal-packages in one directory

    - by aleator
    What is the recommended way of having several cabal packages in one directory?

    Why: I have an old project with many separable modules. Since they originally formed just one program, it was, and still is, handy to have them in the same directory for easy compiling.

    Options:

    - Just suffer and split everything, including the VCS holding the stuff, into different directories?
    - Hack cabal until it is happy with multiple .cabal files in the same directory?
    - Make another subdirectory for each module and put .cabal files there, along with symlinks to the original pieces of code?
    - Something smarter? What?

  • Getting XAMPP to work with multiple version of PHP

    - by Pennf0lio
    How can I install XAMPP to work with different versions of PHP? I use XAMPP because some of my scripts are buggy when run in WAMP; I use WAMP because it supports different versions of PHP. But now I would like to streamline down to just XAMPP so that my web development is easier to manage. Is it possible to configure XAMPP to work with more than one version of PHP, or do I have to look for an alternative solution? Note: I'm running on Windows 7.

  • How to set up multiple DNS servers on an intranet

    - by Brent
    We have an Active Directory network with a mixture of Windows DNS and Linux BIND servers, and we want to use OpenDNS as our external DNS provider. I am wondering what the best way is to set up these servers (regarding forwarders, recursion, etc.).

    Active Directory is our main internal DNS for our domain and has 3 redundant servers; DHCP and all our servers use these as their DNS servers. Then we have a legacy AD server from an old network that is still authoritative for a bunch of domains. Finally, we have a couple of Linux BIND servers that are authoritative for a bunch of websites we host.

    Should our main AD servers point to our legacy AD server, which points to one of our BIND servers, which points to the other BIND server, which finally points out to OpenDNS? Or should our main AD servers point to all of these directly? Or is there a better option? What happens if a domain is listed in 2 places; does DNS process the forwarders in order? What about root servers: if I want to use OpenDNS for "everything else", do I just list the OpenDNS servers as the last forwarders and delete the root servers from all my DNS servers? And how does recursion work; in this scenario, should I be using recursion or not?

  • Running Magento for multiple clients - single installation vs. multiple installations

    - by Chris Hopkins
    Hi there. I am looking to set up a Magento (Community Edition) installation for multiple clients and have researched the matter for a few days now. I can see that the Enterprise Edition has what I need, but I am not willing to shell out the $12,000-odd yearly subscription. It seems there are a few options available to me, but I am worried about the performance I will get out of each, as I have heard a lot about Magento's slowness.

    Option 1) Single install using the AITOC advanced permissions module. This is really what I am after: one installation, so that I can update my core files all at the same time and manage all my store users from one place. The problems here are that I don't know anything about the reliability of this extra product and that I have to pay a bit extra. I am also worried that if I have 10 stores running off this one installation, it might slow down so much that it keels over. Module link: http://www.aitoc.com/en/magentomods_advanced_permissions.html

    Option 2) Multiple installations of Magento on one server, one per shop. Here I have 10 Magento installations on one server, all running happily away and not costing any extra money, but I now have 10 separate stores to update and maintain, which could be annoying. Also, I haven't been able to find many other people using this method, and when I have, they are usually asking how to stop their servers from dying. So this route could load my server even more, though each installation would be simpler and less likely to slow down than one installation running 10 shops on its own.

    Option 3) Use lots of servers and lots of Magento installations. I just do not want to do this.

    Option 4) Buy Magento Enterprise. I do not have the money to do this.

    So which route is least likely to blow up my server? And does anyone have experience with this holy grail of a module? Thanks for reading and thanks in advance for any help - Chris Hopkins

  • MySQL Backup: Can I copy individual MyISAM table files to another server with a different MySQL version?

    - by Komputer
    What I mean by copying individual MyISAM table files is: shut down mysqld and copy the .frm, .myd, and .myi files from one database folder to another. Questions: (a) Can I use this method to back up a MySQL database folder from one server to another server with a different MySQL version? (b) Can these backup files be moved to a different OS (for example, Debian to CentOS)? Thanks in advance.
